
WO2024109357A1 - Image processing method, apparatus, device, storage medium and program product - Google Patents

Image processing method, apparatus, device, storage medium and program product

Info

Publication number
WO2024109357A1
WO2024109357A1 (PCT/CN2023/123514)
Authority
WO
WIPO (PCT)
Prior art keywords
optimized
value
image
parameter
difference information
Prior art date
Application number
PCT/CN2023/123514
Other languages
English (en)
French (fr)
Inventor
蒋心为
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Publication of WO2024109357A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/005 General purpose rendering architectures
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/90 Determination of colour characteristics
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries

Definitions

  • the present application relates to the field of image processing technology, and in particular to an image processing method, device, equipment, storage medium and program product.
  • the application of image technology is becoming more and more extensive; for example, image technology is applied to games, animation, and advertising.
  • the objects in the image need to be rendered.
  • the process is: setting material parameters of the objects in the image, rendering the objects in the image according to the material parameters, and thus obtaining the image.
  • the material parameters of the objects in the image are set manually, resulting in low rendering efficiency and accuracy.
  • the embodiments of the present application provide an image processing method, apparatus, device, storage medium and program product, which can address the technical problem that manually setting the values of material parameters results in low rendering efficiency and accuracy.
  • an image processing method, including:
  • acquiring a material parameter to be optimized of a target object, and acquiring an initial value of the material parameter to be optimized;
  • acquiring a captured image of the target object, and determining shooting parameters of the captured image;
  • rendering the target object according to the initial value and the shooting parameters to obtain a rendered image of the target object;
  • determining difference information between the rendered image and the captured image according to the rendered image and the captured image;
  • adjusting the initial value according to the difference information to obtain a target value of the material parameter to be optimized, so as to render the target object according to the target value to obtain a rendered image of the target object.
  • an embodiment of the present application further provides an image processing device, including:
  • a parameter acquisition module used to acquire the material parameters to be optimized of the target object and to acquire the initial values of the material parameters to be optimized;
  • An image acquisition module used to acquire a captured image of the target object and determine shooting parameters of the captured image
  • An object rendering module used for rendering the target object according to the initial value and the shooting parameter to obtain a rendered image of the target object
  • an information determination module configured to determine difference information between the rendered image and the captured image based on the rendered image and the captured image
  • a parameter optimization module is used to adjust the initial value according to the difference information to obtain a target value of the material parameter to be optimized, so as to render the target object according to the target value to obtain a rendered image of the target object.
  • an embodiment of the present application further provides an electronic device, including a processor and a memory, wherein the memory stores a computer program, and the processor is used to run the computer program in the memory to implement the image processing method provided in the embodiment of the present application.
  • an embodiment of the present application further provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and the computer program is suitable for loading by a processor to execute any image processing method provided in the embodiment of the present application.
  • an embodiment of the present application further provides a computer program product, including a computer program, which implements any image processing method provided in the embodiment of the present application when executed by a processor.
  • FIG. 1 is a schematic diagram of a scene of an image processing process provided by an embodiment of the present application.
  • FIG. 2 is a schematic flow chart of an image processing method provided in an embodiment of the present application.
  • FIG. 3 is a schematic flow chart of another image processing method provided in an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a material parameter setting process provided in an embodiment of the present application.
  • FIG. 5 is a schematic diagram of another material parameter setting process provided in an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a material parameter optimization method provided in an embodiment of the present application.
  • FIG. 7 is a schematic diagram of the structure of an image processing device provided in an embodiment of the present application.
  • FIG. 8 is a schematic diagram of the structure of an electronic device provided in an embodiment of the present application.
  • the embodiments of the present application provide an image processing method, apparatus, device, storage medium and program product, wherein the device may be an electronic device, the storage medium may be a computer storage medium, and the program product may be a computer program product.
  • the image processing apparatus may be integrated in an electronic device, and the electronic device may be a server, a terminal or other device.
  • the server can be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server that provides basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, network acceleration services (Content Delivery Network, CDN), as well as big data and artificial intelligence platforms.
  • multiple servers can form a blockchain, and the servers are nodes on the blockchain.
  • the terminal may be a smart phone, a tablet computer, a laptop computer, a desktop computer, a smart speaker, a smart watch, a virtual reality device (VR) and an augmented reality device (AR), etc., but is not limited thereto.
  • the terminal and the server may be directly or indirectly connected via wired or wireless communication, and this application does not limit this.
  • FIG. 1 is a schematic diagram of a scene of an image processing process provided by an embodiment of the present application, including a terminal 110 and a server 120.
  • the server 120 obtains the material parameter to be optimized of the target object and initializes it to obtain the initial value of the material parameter to be optimized; obtains the captured image of the target object and determines the shooting parameters of the captured image; renders the target object according to the initial value and the shooting parameters to obtain the rendered image of the target object; determines the difference information between the rendered image and the captured image according to the rendered image and the captured image; and optimizes (specifically, adjusts) the initial value according to the difference information to obtain the target value of the material parameter to be optimized.
  • the terminal 110 sends a parameter acquisition request for the target object to the server 120, the server 120 returns the target value of the material parameter to be optimized to the terminal 110 according to the parameter acquisition request, and the terminal 110 renders the target object according to the target value to obtain the rendered image of the target object.
  • the image processing method of the embodiment of the present application can be applied to various scenarios for rendering objects.
  • the image processing method of the embodiment of the present application can be used to render objects in the game, and the objects in the game can be, for example, game props, game characters, or game backgrounds.
  • the image processing method of the embodiment of the present application can also be used to render objects in animations, and the objects in the animations can be characters, animals, or scenic spots in the animations.
  • the term “plurality” in the embodiments of the present application refers to two or more than two.
  • the terms “first” and “second” in the embodiments of the present application are used to distinguish descriptions and should not be understood as implying relative importance.
  • Unreal Engine is a framework composed of the various tools needed to develop games. Game developers can directly call the tools in Unreal Engine to quickly create game applications without starting from scratch.
  • the rendering pipeline refers to the overall process of converting data from a 3D scene into a 2D image and finally displaying it on the screen.
  • the rendering pipeline can include the application stage, the geometry stage, and the rasterization stage.
  • the application stage refers to the process of obtaining scene data, which includes vertex three-dimensional coordinates, light sources, camera positions, view frustums, and the materials of each object.
  • the geometry stage includes the process of calculating on the scene data, including coordinate transformations, vertex shading, projection transformations, and clipping.
  • the rasterization stage includes the process of converting 3D continuous objects into discrete screen pixels and determining the color of screen pixels based on lighting and material parameters.
  • the initial value of the material parameter to be optimized and the shooting parameters of the target object can be input into the rendering pipeline for rendering.
  • a rendered image of the target object is obtained.
  • the description will be made from the perspective of the image processing device.
  • the following detailed description will be given with the image processing device integrated in the terminal, that is, with the terminal as the execution subject.
  • FIG. 2 is a flow chart of an image processing method provided by an embodiment of the present application.
  • the image processing method is executed by an electronic device, for example, by the terminal 110 in FIG. 1, or by the server 120, or by the interaction of the two.
  • the image processing method may include:
  • the target object may refer to an object existing in reality, which may be a person, a pet, a plant, or various products.
  • the target object may be car paint or stone.
  • the target object can exist in the renderer in the form of a model, which refers to a graphic composed of points, lines or surfaces without information such as color and texture.
  • the game character corresponding to the game player in the game can be obtained.
  • Material parameters refer to parameters indicating visual properties of a target object, including at least one of color, texture, smoothness, transparency, reflectivity, refractive index, and luminosity. It should be understood that the material parameters of different target objects may be different or the same.
  • the material parameters of the target object may refer to the transparency and color of the car paint.
  • the material parameters of the target object may refer to the smoothness and color of the stone.
  • Material parameters can be represented in two ways: one is to represent the material parameters of the target object through a map, and the other is to represent the material parameters of the target object through a specific value.
  • when the material parameter is represented by a specific numerical value, the initial value of the material parameter to be optimized may be an initial numerical value.
  • when the material parameter is represented by a map, the initial value of the material parameter to be optimized may be an initial map.
  • the material parameters to be optimized of the target object may refer to material parameters that have not been optimized among the material parameters of the target object.
  • the terminal Before optimizing the material parameters to be optimized, the terminal may display a setting interface, and then obtain the material parameters to be optimized of the target object in response to the user's triggering operation on the setting interface, that is, the initial value may be manually set by the user.
  • the terminal may also initialize the material parameters to be optimized according to a preset initialization algorithm to obtain the initial values of the material parameters to be optimized. At this time, the terminal automatically obtains the initial values of the material parameters to be optimized without manual setting by the user.
  • the preset initialization algorithm can be selected according to actual conditions. For example, a random initialization algorithm or a standard initialization algorithm can be used as the preset initialization algorithm in the embodiment of the present application, and this embodiment is not limited here.
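  • As an illustrative sketch only (the function name, parameter names and value ranges below are assumptions, not part of this application), random initialization of a scalar material parameter and a vector material parameter might look like this:

```python
import random

def random_initialize(material_params):
    """Assign random initial values to the material parameters to be optimized.

    `material_params` maps a parameter name to its kind: "scalar" (e.g. the
    opacity) or "vector3" (e.g. an RGB base color). Ranges are assumptions.
    """
    initial_values = {}
    for name, kind in material_params.items():
        if kind == "scalar":
            initial_values[name] = random.uniform(0.0, 1.0)
        elif kind == "vector3":
            initial_values[name] = [random.uniform(0.0, 1.0) for _ in range(3)]
    return initial_values

# Example: transparency (scalar) and color (three-dimensional vector) parameters.
print(random_initialize({"Opacity": "scalar", "BaseColor": "vector3"}))
```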
  • the terminal may obtain the initial value of the material parameter to be optimized when obtaining the material parameter to be optimized of the target object, or the terminal may obtain the initial value of the material parameter to be optimized after obtaining the material parameter to be optimized of the target object, which is not limited in the embodiments of the present application.
  • S202 Acquire a captured image of the target object and determine shooting parameters of the captured image.
  • there may be at least one captured image of the target object; the target object may be photographed under the same shooting parameters to obtain a preset number of captured images, or one captured image may be taken under one set of shooting parameters, which is not limited in this embodiment.
  • the shooting parameters of the captured image may refer to the parameters when shooting the target object, which may include at least one of the parameters of the shooting environment of the target object and the parameters of the camera.
  • the parameters of the shooting environment of the target object may include the weather of the shooting environment (for example, rain), as well as the light intensity, light color, or light source position, etc.
  • the parameters of the camera may include the positional relationship between the camera and the target object, and the positional relationship may include at least one of a distance and an angle.
  • the terminal may obtain the captured image of the target object through its own camera, or may obtain the captured image of the target object through the camera of other terminals, which then send the captured image to the terminal, so that the terminal obtains the captured image of the target object.
  • S203 Render the target object according to the initial value and the shooting parameters to obtain a rendered image of the target object.
  • Rendering refers to the process of adjusting the model corresponding to the object into an image according to the set environment, light and material parameters.
  • the target object is rendered to obtain a rendered image of the target object.
  • a rendering effect of rendering the target object according to the initial values is determined, and then the target object is rendered according to the rendering effect, so as to obtain a rendered image of the target object.
  • for example, assuming that the shooting parameter is an illuminance of 280 lux, the material parameter to be optimized is the color material parameter, and its initial value is (255, 0, 0), the terminal determines, under 280 lux illumination, the rendering effect of rendering the target object according to (255, 0, 0), and then renders the target object according to that rendering effect.
  • the terminal may render the target object according to the initial values and the shooting parameters through a renderer (e.g., a rendering pipeline), thereby obtaining a rendered image of the target object.
  • the renderer can be selected according to actual conditions.
  • the renderer can be a renderer in the Unreal Engine or the Unity Engine, which is not limited in the embodiments of the present application.
  • S204 Determine difference information between the rendered image and the photographed image according to the rendered image and the photographed image.
  • the terminal can determine the ratio between the image parameter values of the rendered image and the image parameter values of the captured image, and use the ratio as the difference information between the rendered image and the captured image. Alternatively, the terminal can determine the difference between the image parameter values of the rendered image and those of the captured image, and use the difference as the difference information between the rendered image and the captured image.
  • the image parameter values of the rendered image may include all image parameter values of the rendered image, and the image parameter values of the captured image may include all image parameter values of the captured image.
  • the image parameter values of the rendered image may only include the rendered values corresponding to the material parameters to be optimized, and the image parameter values of the captured image may include the real values corresponding to the material parameters to be optimized. This embodiment is not limited here.
  • determining the difference information between the rendered image and the captured image according to the rendered image and the captured image includes:
  • determining a rendering value corresponding to the material parameter to be optimized in the rendered image, and a real value corresponding to the material parameter to be optimized in the captured image;
  • determining the difference information between the rendered image and the photographed image according to the rendering value and the real value.
  • the material parameter to be optimized refers to the color material parameter
  • the color value of the rendered image is A
  • the color value of the captured image is B
  • the difference between A and B is used as the difference information between the rendered image and the captured image.
  • the color material parameter includes an R parameter, a G parameter, and a B parameter
  • the difference information between the rendered image and the captured image can be determined based on the sub-difference information of each sub-material parameter.
  • the rendered value may include the rendered value of each sub-material parameter
  • the true value may include the true value of each sub-material parameter.
  • the difference information between the rendered image and the captured image is determined based on the rendered value and the true value, including:
  • determining sub-difference information between the rendered value and the true value of each sub-material parameter;
  • determining the difference information between the rendered image and the captured image according to each piece of sub-difference information.
  • the terminal may directly add the sub-difference information of each sub-material parameter, thereby obtaining the difference information between the rendered image and the captured image.
  • the terminal can first calculate the square of each sub-difference information, and then add the squares of each sub-difference information to obtain the difference information between the rendered image and the captured image.
  • the difference information between the rendered image and the captured image is determined based on the Euclidean distance between each sub-material parameter.
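  • A minimal sketch of this computation (all names are illustrative assumptions): each sub-difference is squared, the squares are summed, and the square root gives the Euclidean distance used as the difference information:

```python
import math

def difference_information(rendered_value, true_value):
    """Euclidean distance between the rendered value and the true value.

    Both arguments are sequences of sub-material-parameter values,
    e.g. (R, G, B) for a color material parameter.
    """
    sub_differences = [r - t for r, t in zip(rendered_value, true_value)]
    return math.sqrt(sum(d * d for d in sub_differences))

# Example: rendered color versus the color observed in the captured image.
print(difference_information((255, 0, 0), (248, 12, 5)))
```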
  • the difference information between the rendered image and the captured image is determined according to the rendered image and the captured image, including:
  • performing color space conversion on the rendered image to obtain a converted rendered image, and performing color space conversion on the captured image to obtain a converted captured image;
  • determining the difference information between the rendered image and the captured image according to the converted rendered image and the converted captured image.
  • the converted color space may be selected according to actual conditions.
  • the color space of the rendered image may be converted into a Lab color space or an HSV color space, which is not limited in this embodiment.
  • the color information of the rendered image and the color information of the captured image are described using the RGB color space.
  • the calculation of the color information described using the RGB color space is relatively complicated, and the user's sensitivity to the color information described in the RGB color space is relatively low, resulting in a low accuracy of the target value obtained based on the difference information.
  • the rendered image is first subjected to a color space conversion to obtain a converted rendered image
  • the captured image is subjected to a color space conversion to obtain a converted captured image
  • the difference information between the rendered image and the captured image is determined based on the converted rendered image and the converted captured image, thereby reducing the amount of calculation in the process of determining the difference information and improving the accuracy of the target value obtained based on the difference information.
  • if the difference information does not meet the preset condition, the initial value is adjusted according to the difference information to obtain the target value of the material parameter to be optimized; if the difference information meets the preset condition, the initial value is used as the target value of the material parameter to be optimized.
  • when there are multiple captured images, the rendered images also include multiple images.
  • in this case, the difference information between each rendered image and the corresponding captured image can be added to obtain total difference information, and the initial value can be adjusted according to the total difference information to obtain the target value of the material parameter to be optimized; alternatively, the initial value can be adjusted according to the average value of the total difference information to obtain the target value of the material parameter to be optimized.
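  • For illustration, a sketch of combining per-image difference values into a total or average (the function and the numeric example are assumptions):

```python
def combined_difference(per_image_differences, use_average=False):
    """Combine the difference information of each rendered/captured image pair.

    `per_image_differences` holds one difference value per image pair; the
    result (sum or average) drives the adjustment of the initial value.
    """
    total = sum(per_image_differences)
    return total / len(per_image_differences) if use_average else total

print(combined_difference([4.2, 3.9, 5.1]))        # total difference information
print(combined_difference([4.2, 3.9, 5.1], True))  # average difference information
```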
  • in the related art, the values of material parameters are set manually by staff.
  • in the process of setting the values of material parameters, the staff generally rely on subjective judgment to decide whether the rendered image obtained according to the material parameters achieves the desired visual effect; if not, the material parameter values are adjusted again until the rendered image obtained according to the material parameters achieves the desired visual effect.
  • in the embodiments of the present application, the initial values of the material parameters to be optimized are adjusted according to the difference information between the rendered image and the captured image, so that the values of the material parameters to be optimized are set automatically, which improves the efficiency of setting these values. By judging, according to the difference information between the rendered image and the captured image, whether the initial values of the material parameters to be optimized meet the preset conditions, the image obtained by rendering the target object according to the target values is made as close to the captured image as possible without subjective judgment by the staff, which improves the accuracy of the values of the material parameters to be optimized.
  • since the values of the material parameters to be optimized can be determined quickly and accurately, the efficiency and accuracy of the rendered image obtained by rendering the target object according to these values are also higher.
  • in order to obtain the target value more accurately, the initial value may be optimized multiple times. In this case, adjusting the initial value according to the difference information to obtain the target value of the material parameter to be optimized includes:
  • if the difference information does not meet the preset condition, adjusting the initial value according to the difference information to obtain a candidate value;
  • using the candidate value as the initial value of the material parameter to be optimized, and returning to execute the step of rendering the target object according to the initial value and the shooting parameters;
  • if the difference information meets the preset condition, using the initial value as the target value of the material parameter to be optimized.
  • for example, the initial value of the material parameter to be optimized is set to A1, and the target object is rendered according to A1 and the shooting parameters to obtain the rendered image m1 of the target object.
  • the difference information d1 between the rendered image m1 and the captured image is determined. If d1 is greater than a preset threshold, A1 is adjusted according to d1 to obtain a candidate value A2, and the target object is rendered according to A2 and the shooting parameters to obtain the rendered image m2 of the target object.
  • the difference information d2 between the rendered image m2 and the captured image is determined. If d2 is less than or equal to the preset threshold, A2 is used as the target value of the material parameter to be optimized. If d1 is less than or equal to the preset threshold, A1 is used as the target value of the material parameter to be optimized.
  • the step returned for execution may also be to obtain a captured image of the target object and determine the shooting parameters of the captured image.
  • the material parameter to be optimized can be optimized through different captured images.
  • the value of the material parameter to be optimized will reach a convergence state when the number of adjustments reaches a certain number, in the process of multiple adjustments to the initial value, it is also possible to determine whether to stop the optimization based on the number of adjustments. That is, after obtaining the difference information, the current number of adjustments is determined. If the current number of adjustments is less than the preset number of adjustments, the initial value is adjusted according to the difference information to obtain the candidate value of the material parameter to be optimized. If the current number of adjustments is equal to the preset number of adjustments, the initial value is used as the target value of the material parameter to be optimized.
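  • The stopping logic described above can be sketched as follows; the render, difference and adjust callables stand in for the rendering, difference-information and adjustment steps, and the threshold and adjustment limit are assumed values:

```python
def optimize_parameter(initial_value, captured_image, shooting_params,
                       render, difference, adjust,
                       threshold=1.0, max_adjustments=100):
    """Render, compare and adjust until the difference information meets the
    preset condition or the preset number of adjustments is reached."""
    value = initial_value
    for _ in range(max_adjustments):
        rendered_image = render(value, shooting_params)
        diff = difference(rendered_image, captured_image)
        if diff <= threshold:        # preset condition met: stop optimizing
            break
        value = adjust(value, diff)  # otherwise adjust and render again
    return value                     # target value of the material parameter
```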
  • the initial value is optimized multiple times according to the difference information, thereby improving the accuracy of the target value of the material parameter to be optimized.
  • in some embodiments, adjusting the initial value according to the difference information to obtain the target value of the material parameter to be optimized includes:
  • determining a change amount of the material parameter to be optimized according to the difference information;
  • adjusting the initial value according to the change amount to obtain the target value of the material parameter to be optimized.
  • the change amount of the material parameter to be optimized may refer to at least one of the change direction and change magnitude of the material parameter to be optimized, representing the change trend of the material parameter to be optimized.
  • the change direction may refer to the gradient direction of the material parameter to be optimized.
  • the direction of change of the material parameter to be optimized can be determined from the derivative of the difference information with respect to the material parameter to be optimized, and the magnitude of the change of the material parameter to be optimized can be determined based on the magnitude of the difference information.
  • the larger the difference information, the larger the change of the material parameter to be optimized; the smaller the difference information, the smaller the change of the material parameter to be optimized.
  • adjusting the initial value according to the change amount to obtain the target value of the material parameter to be optimized includes:
  • obtaining an optimization step size corresponding to the material parameter to be optimized;
  • adjusting the initial value according to the optimization step size and the change amount to obtain the target value of the material parameter to be optimized.
  • setting the optimization step size (also referred to as an optimization hyperparameter) can be used to determine the degree of optimization corresponding to the initial value, thereby avoiding missing the optimal value of the material parameter to be optimized.
  • the change amount can be expressed numerically; the optimization step size and the change amount are multiplicatively fused (for example, multiplied) to obtain the optimization degree corresponding to the initial value, and then the optimization degree and the initial value are additively fused (for example, added) to obtain the target value of the material parameter to be optimized.
  • the optimization step size may be fixed or may be adjusted dynamically according to the difference information, which is not limited in this embodiment.
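  • A sketch of the multiplicative and additive fusion just described (names, the default step size and the example values are assumptions):

```python
def adjust_value(initial_value, change_amount, step_size=0.1):
    """One adjustment of a material parameter value.

    `change_amount` encodes the change direction and magnitude derived from
    the difference information (e.g. a negative gradient); `step_size` is the
    optimization step size (optimization hyperparameter).
    """
    optimization_degree = step_size * change_amount  # multiplicative fusion
    return initial_value + optimization_degree       # additive fusion

print(adjust_value(0.2, change_amount=-0.5))  # e.g. opacity moves from 0.2 to 0.15
```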
  • the terminal can optimize the material parameters one by one, or optimize the material parameters at the same time.
  • the material parameter to be optimized can first be screened out from the at least two material parameters of the target object, and then the values of the candidate material parameters are fixed while the value of the material parameter to be optimized is optimized, where the candidate material parameters are the parameters among the at least two material parameters other than the material parameter to be optimized.
  • the target object is rendered according to the initial value and the shooting parameters to obtain a rendered image of the target object, including:
  • the target object is rendered according to the initial value, the set value corresponding to the candidate material parameter, and the shooting parameter to obtain a rendered image of the target object.
  • the candidate material parameters may be optimized parameters or unoptimized parameters
  • the set values of the candidate material parameters may refer to the initial values of the candidate material parameters or the target values of the candidate material parameters.
  • the material parameters of the target object are color material parameters and transparency material parameters.
  • the terminal first optimizes the transparency material parameters as the material parameter to be optimized. At this time, the color material parameters have not been optimized. Therefore, the setting value of the color material parameters is the initial value of the color material parameters.
  • after the target value of the transparency material parameter is obtained, the color material parameter is optimized as the material parameter to be optimized. At this time, the transparency material parameter has been optimized. Therefore, in the process of optimizing the color material parameter, the set value of the transparency material parameter is the target value of the transparency material parameter.
  • selecting the material parameter to be optimized from the at least two material parameters includes:
  • determining an optimization order of the at least two material parameters according to the association relationship between the material parameters;
  • screening out the material parameter to be optimized from the at least two material parameters according to the optimization order.
  • the optimization order of the material parameters can be determined according to the correlation between the material parameters, and then the material parameters to be optimized can be screened out from at least two material parameters according to the optimization order, so that the accuracy of the values of the material parameters of the target object finally obtained is relatively high.
  • the terminal can set the influential material parameters to be at the end of the optimization order, and the material parameters that have no influence to be at the front of the optimization order.
  • the material parameters include color material parameters and transparency material parameters.
  • the transparency material parameters have no effect on the color material parameters. Therefore, the transparency material parameters can be optimized first, and then the color material parameters can be optimized.
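  • One possible way to realize such an ordering (a sketch; the influence counts in the example are hypothetical, not data from this application) is to sort the parameters so that parameters that influence no other parameter come first:

```python
def optimization_order(influence_count):
    """Order material parameters so that parameters influencing no other
    parameter are optimized first and the most influential ones come last.

    `influence_count` maps a parameter name to how many other parameters it
    influences (hypothetical values in the example below).
    """
    return sorted(influence_count, key=lambda name: influence_count[name])

# Transparency influences no other parameter here, so it is optimized before
# the color material parameter.
print(optimization_order({"Opacity": 0, "BaseColor": 1}))
```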
  • when the terminal optimizes the material parameters one by one and the optimization of the current material parameter to be optimized is completed, if there are still unoptimized material parameters, the unoptimized material parameters need to be optimized.
  • therefore, after adjusting the initial value according to the difference information and obtaining the target value of the material parameter to be optimized, the method also includes:
  • determining the optimization status of the at least two material parameters; if there is a material parameter whose optimization status is the to-be-optimized state, taking that material parameter as the material parameter to be optimized, and returning to execute the step of selecting the material parameter to be optimized from the at least two material parameters.
  • if the optimization status of a material parameter is the to-be-optimized state, it means that the material parameter has not been optimized, so it still needs to be optimized.
  • the material parameters of the target object are color material parameters and transparency material parameters.
  • the terminal first optimizes the transparency material parameters as the material parameters to be optimized, that is, according to the initial value of the transparency material parameters, the shooting parameters and the set values of the color material parameters, the target object is rendered to obtain a rendered image. If the difference information between the rendered image and the shot image meets the preset conditions, the optimization of the transparency material parameters is completed. At this time, the color material parameters have not been optimized, so the optimization state of the color material parameters is the state to be optimized. Therefore, the color material parameters are taken as the material parameters to be optimized, and the values of the color material parameters are continuously optimized.
  • all material parameters of the target object are optimized by optimizing the material parameters one by one.
  • when at least two material parameters are optimized at the same time, determining difference information between the rendered image and the photographed image according to the rendered image and the photographed image includes:
  • determining initial difference information corresponding to each material parameter to be optimized according to the rendered image and the photographed image;
  • determining the difference information between the rendered image and the captured image according to each piece of initial difference information.
  • the terminal can determine the initial difference information between the rendered image and the captured image of each material parameter to be optimized based on the rendering value corresponding to each material parameter to be optimized in the rendered image and the real value corresponding to the captured image, and then add up the initial difference information to obtain the difference information between the rendered image and the captured image.
  • the process of determining the difference information between the rendered image and the photographed image according to the rendered image and the photographed image may also be: determining a total rendered value of the rendered image and a total true value of the photographed image, and determining the difference information between the rendered image and the photographed image according to the total rendered value and the total true value.
  • the optimization step size corresponding to each material parameter may be different or the same, and the embodiment of the present application does not limit this.
  • all material parameters of the target object are optimized at the same time, reducing the time required to obtain the target value of each material parameter, thereby further improving the speed of obtaining the target value of the material parameter of the target object.
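  • A compact sketch of adjusting all material parameters in the same iteration (the callables, step sizes and the assumed preset condition are illustrative placeholders):

```python
def optimize_all(values, step_sizes, render, per_param_difference,
                 captured_image, shooting_params, iterations=50):
    """Adjust all material parameters to be optimized in the same iteration.

    `values` and `step_sizes` map parameter names to numbers.
    `per_param_difference(rendered, captured)` returns, per parameter name,
    a (initial_difference, change_amount) pair.
    """
    for _ in range(iterations):
        rendered_image = render(values, shooting_params)
        per_param = per_param_difference(rendered_image, captured_image)
        # Sum the initial difference information of every parameter.
        total_difference = sum(diff for diff, _ in per_param.values())
        if total_difference < 1.0:  # assumed preset condition
            break
        for name, (_, change) in per_param.items():
            values[name] += step_sizes[name] * change  # per-parameter step size
    return values
```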
  • the terminal can immediately render the target object using the target value according to the actual application scenario, thereby obtaining a rendered image of the target object.
  • the terminal can also render the target object using the target value according to the actual application scenario after a preset time interval, thereby obtaining a rendered image of the target object.
  • the embodiments of the present application are not limited here.
  • the target value corresponding to the material parameter to be optimized of the target object in different application scenarios may be the same.
  • the rendering effect after rendering the target object using the target value may be different.
  • the application scenario includes a first application scenario and a second application scenario.
  • the first application scenario is that moonlight shines on the ground and there is car paint on the ground.
  • the second application scenario is that sunlight shines on the ground and there is car paint on the ground.
  • the car paint in the first application scenario and the car paint in the second application scenario are rendered respectively.
  • the rendering effect of the car paint in the first application scenario and the rendering effect of the car paint in the second application scenario are different.
  • in the embodiments of the present application, the material parameter to be optimized of the target object is obtained and initialized to obtain its initial value; a captured image of the target object is obtained and the shooting parameters of the captured image are determined; the target object is rendered according to the initial value and the shooting parameters to obtain a rendered image of the target object; difference information between the rendered image and the captured image is determined according to the rendered image and the captured image; and the initial value is adjusted according to the difference information to obtain the target value of the material parameter to be optimized, so as to render the target object according to the target value to obtain a rendered image of the target object. In this way, the initial value of the material parameter to be optimized is automatically optimized through the captured image, which improves the efficiency and accuracy of optimizing the material parameter to be optimized and thus improves the efficiency and accuracy of rendering the target object according to the target value.
  • FIG. 3 is a schematic diagram of a flow chart of an image processing method provided in an embodiment of the present application.
  • the image processing method is executed by an electronic device, for example, by the terminal 110 in FIG. 1.
  • the flow chart of the image processing method may include:
  • the terminal obtains a transparency material parameter and a color material parameter of a target object, and obtains an initial value of the transparency material parameter and an initial value of the color material parameter.
  • the transparency material parameter (Opacity) can be a scalar (scalar), such as a floating point type, and the color material parameter (Base Color) can be a three-dimensional vector (vector).
  • the process of optimizing the material parameters by the terminal can include a parameter setting part and a parameter optimization part.
  • the terminal can add scalar parameters and vector parameters in Unreal Engine 4 (UE4).
  • the parameter setting interface of Unreal Engine 4 is shown in Figure 4 (the parameter value is the initial value of the material parameter).
  • the terminal can create a Material Parameter Collection (MPC) in Unreal Engine 4, and then set the transparency material parameter and the color material parameter in the material parameter collection.
  • after setting the initial value of the transparency material parameter and the initial value of the color material parameter in the material parameter collection, the terminal inputs the material parameter collection into the renderer of Unreal Engine 4.
  • the terminal may also input only the initial value of the material parameter to be optimized in the material parameter collection into the renderer of Unreal Engine 4, and the candidate material parameter may adopt its default parameter value.
  • for example, the initial value of the transparency material parameter is input into the renderer of Unreal Engine 4, where the data involved in the Fresnel function shown in FIG. 5 includes the exponent (Exponent), the base reflect fraction (BaseReflectFraction), and the normal (Normal).
  • the terminal determines an association relationship between a transparency material parameter and a color material parameter, and based on the association relationship, selects the transparency material parameter as a material parameter to be optimized and the color material parameter as a candidate material parameter from the transparency material parameter and the color material parameter.
  • S303 The terminal obtains a captured image of the target object and determines the illumination of the captured image.
  • S304 The terminal renders the target object according to the set values of the candidate material parameters, the initial values of the material parameters to be optimized, and the illumination to obtain a rendered image of the target object.
  • S305 The terminal determines a rendering value corresponding to the material parameter to be optimized in the rendered image and a real value corresponding to the captured image, and determines whether the material parameter to be optimized is a color material parameter.
  • if the material parameter to be optimized is not a color material parameter, the terminal determines the difference between the rendered value and the true value, and uses the difference between the rendered value and the true value as the difference information between the rendered image and the captured image.
  • if the material parameter to be optimized is a color material parameter, the terminal performs color space conversion on the rendered value and the real value to obtain a converted rendered value and a converted real value.
  • the rendering value when the rendering value is a color value expressed in the RGB color space, the rendering value can be converted into a color value expressed in the LAB color space.
  • the rendering value in the RGB color space can be first converted into a candidate rendering value in the XYZ color space, and then the candidate rendering value in the XYZ color space can be converted into the converted rendering value in the LAB color space. That is, the rendering value can be substituted into the following formula for conversion to obtain the converted rendering value:
  • [X, Y, Z]^T = M · [R, G, B]^T, where M represents the conversion matrix from the RGB color space to the XYZ color space;
  • Li = 116·f(Y/Yn) − 16, ai = 500·(f(X/Xn) − f(Y/Yn)), bi = 200·(f(Y/Yn) − f(Z/Zn)), where f(t) = t^(1/3) when t > (6/29)³ and f(t) = t/(3·(6/29)²) + 4/29 otherwise;
  • Xn, Yn and Zn represent the white reference points under the CIE1931 standard, with Xn = 95.047, Yn = 100.0 and Zn = 108.883;
  • Li, ai and bi represent the converted rendering values.
  • the process of converting the real value may refer to the process of converting the rendering value, which will not be described in detail in this embodiment.
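  • The following sketch implements the conversion just described; it assumes a linear RGB input and uses the standard sRGB/D65 matrix as a stand-in for the conversion matrix M, which this application does not reproduce here:

```python
def rgb_to_lab(r, g, b):
    """Convert an RGB value in [0, 1] to a converted (L, a, b) value.

    Assumes linear RGB and the standard sRGB/D65 matrix; the actual conversion
    matrix M used by the application may differ.
    """
    # RGB -> XYZ (candidate rendering value), scaled so that white has Y = 100.
    x = (0.4124 * r + 0.3576 * g + 0.1805 * b) * 100.0
    y = (0.2126 * r + 0.7152 * g + 0.0722 * b) * 100.0
    z = (0.0193 * r + 0.1192 * g + 0.9505 * b) * 100.0

    xn, yn, zn = 95.047, 100.0, 108.883  # CIE1931 white reference points

    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29

    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

print(rgb_to_lab(1.0, 0.0, 0.0))  # converted rendering value of pure red
```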
  • S308 The terminal determines a difference between the converted rendering value and the converted real value, and uses the difference between the converted rendering value and the converted real value as difference information between the rendered image and the captured image.
  • the color material parameter may include three sub-material parameters;
  • the sub-difference between the converted rendering value and the converted real value of each sub-material parameter may be solved separately, and the sub-differences may then be combined to obtain the difference between the converted rendering value and the converted real value:
  • ΔE = √(ΔL² + Δa² + Δb²), where ΔL = Li − Lr, Δa = ai − ar, and Δb = bi − br;
  • ΔL represents the sub-difference value between the converted rendered value and the converted real value of the brightness sub-material parameter;
  • Δa represents the sub-difference value between the converted rendered value and the converted real value of the red-green sub-material parameter;
  • Δb represents the sub-difference value between the converted rendered value and the converted real value of the yellow-blue sub-material parameter;
  • ΔE represents the difference between the converted rendered value and the converted real value;
  • Lr, ar and br represent the converted real values.
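  • A sketch of computing this difference from the converted values (the numeric example is illustrative):

```python
import math

def delta_e(rendered_lab, real_lab):
    """Difference information between the converted rendering value and the
    converted real value: sqrt(dL^2 + da^2 + db^2)."""
    d_l = rendered_lab[0] - real_lab[0]  # brightness sub-difference
    d_a = rendered_lab[1] - real_lab[1]  # red-green sub-difference
    d_b = rendered_lab[2] - real_lab[2]  # yellow-blue sub-difference
    return math.sqrt(d_l ** 2 + d_a ** 2 + d_b ** 2)

print(delta_e((53.2, 80.1, 67.2), (51.0, 78.5, 66.0)))
```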
  • S3010 The terminal adjusts the initial value of the material parameter to be optimized according to the gradient of the material parameter to be optimized and the optimization step size corresponding to the material parameter to be optimized to obtain a candidate value.
  • the terminal can perform multiplication fusion processing on the gradient of the material parameter to be optimized and the optimization step size corresponding to the material parameter to be optimized, so as to obtain the optimization degree of the material parameter to be optimized, and then perform addition fusion processing on the optimization degree and the initial value, so as to obtain the optimized candidate value of the material parameter to be optimized.
  • the difference information can be substituted into the following formula for fusion processing to obtain the optimized candidate value of the material parameter to be optimized:
  • Lk+1 = Lk − αL·∂ΔE/∂Lk, ak+1 = ak − αa·∂ΔE/∂ak, bk+1 = bk − αb·∂ΔE/∂bk;
  • Lk+1, ak+1 and bk+1 represent the optimized candidate values of the material parameter to be optimized;
  • Lk, ak and bk represent the initial values of the material parameter to be optimized;
  • αL represents the optimization step size of the brightness sub-material parameter, αa represents the optimization step size of the red-green sub-material parameter, and αb represents the optimization step size of the yellow-blue sub-material parameter;
  • k represents the number of adjustments.
  • if the difference information does not meet the preset condition, the terminal uses the candidate value as the initial value of the material parameter to be optimized, and returns to execute step S303.
  • the difference information can also be understood as a loss function value, that is, the rendering value and the true value are substituted into the loss function for calculation, thereby obtaining the loss function value, and the loss function value is used as the difference information.
  • the optimization process of the material parameter in the embodiment of the present application can be shown in Figure 6.
  • the terminal inputs the initial value into the Unreal Engine to render the target object to obtain a rendered image, and then performs color space conversion on the rendering value of the rendered image and the real value of the captured image to obtain the converted rendering value and the converted real value.
  • the converted rendering value and the converted real value are substituted into the loss function to calculate the loss function value, and the gradient of the material parameter to be optimized is calculated according to the loss function value.
  • the initial value of the material parameter to be optimized is optimized according to the gradient descent to obtain the candidate value. Then continue to optimize the candidate values until the loss function value meets the preset conditions.
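  • The loop in FIG. 6 can be sketched as follows; because the renderer is treated as a black box here, the gradient is estimated by a finite difference rather than the analytic derivative, and all names, step sizes and thresholds are assumptions:

```python
import math

def delta_e(p, q):
    """CIE76-style difference, reusing the idea from the previous sketch."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def gradient_descent_optimize(initial_value, captured_lab, render_to_lab,
                              step_size=0.05, eps=1e-3,
                              threshold=1.0, max_iterations=200):
    """Optimize one scalar material parameter by gradient descent.

    `render_to_lab(value)` renders the target object with the given parameter
    value and returns the converted (L, a, b) rendering value; `captured_lab`
    is the converted real value taken from the captured image.
    """
    value = initial_value
    for _ in range(max_iterations):
        loss = delta_e(render_to_lab(value), captured_lab)
        if loss <= threshold:                     # preset condition met
            break
        # Finite-difference estimate of d(loss)/d(value).
        loss_plus = delta_e(render_to_lab(value + eps), captured_lab)
        gradient = (loss_plus - loss) / eps
        value = value - step_size * gradient      # gradient descent update
    return value                                  # target (or best candidate) value
```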
  • after the optimization of the transparency material parameter is completed, the terminal determines the optimization status of the color material parameter; if the optimization status is the to-be-optimized state, the color material parameter is taken as the material parameter to be optimized and its value is optimized in the same way.
  • after the target values of both material parameters are obtained, the terminal may render the target object according to the target value of the color material parameter and the target value of the transparency material parameter to obtain a rendered image of the target object.
  • the difference between the rendered image and the captured image under the same lighting environment is calculated, and then the derivative of the difference with respect to the material parameters to be optimized is calculated, and gradient descent optimization is performed. After several rounds of iterative calculations, a rendered image closest to the captured image can be obtained, thereby obtaining the target value of the material parameter closest to the captured image.
  • the car paint is photographed when the illumination is 280 lux, 210 lux, 160 lux, 110 lux and 65 lux to obtain photographed images of the car paint, and then the transparency material parameters are optimized according to the photographed images at 280 lux, 210 lux, 160 lux, 110 lux and 65 lux.
  • when the initial value of the transparency material parameter is 0.2 and the difference information between the rendered image and the photographed image is 4.243824, which is less than the preset threshold, the initial value 0.2 is used as the target value of the transparency material parameter.
  • the terminal then optimizes the color material parameter according to the captured images at 280 lux, 210 lux, 160 lux, 110 lux and 65 lux and the target value 0.2 of the transparency material parameter.
  • when the initial value of the color material parameter is 0.86 and the difference information between the rendered image and the captured image is 3.043818, which is less than the preset threshold, the initial value 0.86 is used as the target value of the color material parameter.
  • the embodiment of the present application also provides a device based on the above image processing method.
  • the meanings of the terms are the same as those in the above image processing method, and the specific implementation details can refer to the description in the method embodiment.
  • the image processing device may include:
  • the parameter acquisition module 701 is used to acquire the material parameters to be optimized of the target object and to acquire the initial values of the material parameters to be optimized;
  • the image acquisition module 702 is used to acquire a captured image of the target object and determine the capture parameters of the captured image
  • the object rendering module 703 is used to render the target object according to the initial value and the shooting parameters to obtain a rendered image of the target object;
  • the parameter optimization module 705 is used to adjust the initial value according to the difference information to obtain the target value of the material parameter to be optimized, so as to render the target object according to the target value to obtain the rendered image of the target object.
  • in some embodiments, the information determination module 704 is specifically configured to execute:
  • determining a rendering value corresponding to the material parameter to be optimized in the rendered image, and a real value corresponding to the material parameter to be optimized in the captured image;
  • determining the difference information between the rendered image and the photographed image according to the rendering value and the real value.
  • the material parameter to be optimized includes at least two sub-material parameters
  • the rendering value includes the rendering value of each sub-material parameter of the at least two sub-material parameters
  • the real value includes the real value of each sub-material parameter
  • the information determination module 704 is specifically configured to execute:
  • determining sub-difference information between the rendered value and the true value of each sub-material parameter;
  • determining the difference information between the rendered image and the captured image according to each piece of sub-difference information.
  • parameter optimization module 705 is specifically used to perform:
  • if the difference information does not meet the preset condition, the initial value is adjusted according to the difference information to obtain a candidate value;
  • the candidate value is used as the initial value of the material parameter to be optimized, and the step of rendering the target object according to the initial value and the shooting parameters is returned to for execution;
  • if the difference information meets the preset condition, the initial value is used as the target value of the material parameter to be optimized.
  • the parameter optimization module 705 is specifically used to perform:
  • determining a change amount of the material parameter to be optimized according to the difference information;
  • adjusting the initial value according to the change amount to obtain the target value of the material parameter to be optimized.
  • the parameter optimization module 705 is specifically used to perform:
  • obtaining an optimization step size corresponding to the material parameter to be optimized;
  • adjusting the initial value according to the optimization step size and the change amount to obtain the target value.
  • the material parameters to be optimized include color material parameters.
  • the information determination module 704 is specifically configured to execute:
  • performing color space conversion on the rendered image to obtain a converted rendered image, and performing color space conversion on the captured image to obtain a converted captured image;
  • determining the difference information between the rendered image and the captured image according to the converted rendered image and the converted captured image.
  • when the target object has at least two material parameters, the parameter acquisition module 701 is specifically used to execute: screening out the material parameter to be optimized from the at least two material parameters;
  • the object rendering module 703 is specifically used to execute:
  • rendering the target object according to the initial value, the set value corresponding to the candidate material parameter and the shooting parameter to obtain a rendered image of the target object, wherein the candidate material parameter is a parameter of the at least two material parameters except the material parameter to be optimized.
  • the parameter acquisition module 701 is specifically used to execute:
  • determining an optimization order of the at least two material parameters according to the association relationship between the material parameters, and screening out the material parameter to be optimized from the at least two material parameters according to the optimization order.
  • the parameter acquisition module 701 is further configured to execute:
  • after the target value of the material parameter to be optimized is obtained, determining the optimization status of the at least two material parameters; if there is a material parameter whose optimization status is the to-be-optimized state, taking that material parameter as the material parameter to be optimized, and returning to execute the step of selecting the material parameter to be optimized from the at least two material parameters.
  • the information determination module 704 is specifically configured to execute:
  • determining initial difference information corresponding to each material parameter to be optimized according to the rendered image and the captured image;
  • determining the difference information between the rendered image and the captured image according to each piece of initial difference information.
  • the above modules can be implemented as independent entities, or can be arbitrarily combined to be implemented as the same or several entities.
  • the specific implementation methods and corresponding beneficial effects of the above modules can be found in the previous method embodiments, which will not be repeated here.
  • the embodiment of the present application also provides an electronic device, which may be a server or a terminal, etc., as shown in FIG8, which shows a schematic diagram of the structure of the electronic device involved in the embodiment of the present application.
  • the electronic device may include a processor 801 of one or more processing cores, a memory 802 of one or more computer-readable storage media, a power supply 803, and an input unit 804.
  • the electronic device structure shown in FIG8 does not constitute a limitation on the electronic device, and may include more or fewer components than shown, or combine certain components, or arrange components differently.
  • the processor 801 is the control center of the electronic device, which uses various interfaces and lines to connect various parts of the entire electronic device, and executes various functions of the electronic device and processes data by running or executing computer programs and/or modules stored in the memory 802, and calling data stored in the memory 802.
  • the processor 801 may include one or more processing cores; preferably, the processor 801 may integrate an application processor and a modem processor, wherein the application processor mainly processes the operating system, user interface, and application programs, and the modem processor mainly processes wireless communications. It is understandable that the above-mentioned modem processor may not be integrated into the processor 801.
  • the memory 802 can be used to store computer programs and modules.
  • the processor 801 executes various functional applications and data processing by running the computer programs and modules stored in the memory 802.
  • the memory 802 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, a computer program required for at least one function (such as a sound playback function, an image playback function, etc.), etc.; the data storage area may store data created according to the use of the electronic device, etc.
  • the memory 802 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one disk storage device, a flash memory device, or other non-volatile solid-state storage devices. Accordingly, the memory 802 may also include a memory controller to provide the processor 801 with access to the memory 802.
  • the electronic device also includes a power supply 803 for supplying power to each component.
  • the power supply 803 can be logically connected to the processor 801 through a power management system, so as to manage charging, discharging, power consumption and other functions through the power management system.
  • the power supply 803 can also include one or more DC or AC power supplies, recharging systems, power failure detection circuits, power converters or inverters, power status indicators, and any other such components.
  • the electronic device may further include an input unit 804, which may be used to receive input digital or character information and generate keyboard, mouse, joystick, optical or trackball signal input related to user settings and function control.
  • the electronic device may further include a display unit, etc., which will not be described in detail herein.
  • the processor 801 in the electronic device will load the executable files corresponding to the processes of one or more computer programs into the memory 802 according to the following instructions, and the processor 801 will run the computer programs stored in the memory 802, thereby realizing various functions, such as:
  • the initial value is adjusted according to the difference information to obtain the target value of the material parameter to be optimized, and the target object is rendered according to the target value to obtain a rendered image of the target object.
  • an embodiment of the present application provides a computer-readable storage medium, in which a computer program is stored, and the computer program can be loaded by a processor to execute the steps in any image processing method provided in the embodiment of the present application.
  • the computer program can execute the following steps:
  • the initial value is adjusted according to the difference information to obtain the target value of the material parameter to be optimized, and the target object is rendered according to the target value to obtain a rendered image of the target object.
  • the computer readable storage medium may include: read-only memory (ROM), random access memory (RAM), disk or CD, etc.
  • since the computer program stored in the computer-readable storage medium can execute the steps in any image processing method provided in the embodiments of the present application, the beneficial effects that can be achieved by any image processing method provided in the embodiments of the present application can also be achieved. Please refer to the previous embodiments for details, which will not be repeated here.
  • a computer program product or a computer program includes computer instructions, the computer instructions are stored in a computer-readable storage medium.
  • a processor of a computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the above-mentioned image processing method.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

An embodiment of the present application discloses an image processing method, apparatus, device, storage medium and program product. The method comprises: acquiring a material parameter to be optimized of a target object, and acquiring an initial value of the material parameter to be optimized; acquiring a captured image of the target object, and determining a shooting parameter of the captured image; rendering the target object according to the initial value and the shooting parameter to obtain a rendered image of the target object; determining, according to the rendered image and the captured image, difference information between the rendered image and the captured image; and adjusting the initial value according to the difference information to obtain a target value of the material parameter to be optimized, so that the target object is rendered according to the target value to obtain a rendered image of the target object.

Description

图像处理方法、装置、设备、存储介质及程序产品
本申请要求于2022年11月25日提交中国专利局、申请号为202211493839.4、申请名称为“图像处理方法、装置、设备、存储介质及程序产品”的中国专利申请的优先权。
技术领域
本申请涉及图像处理技术领域,具体涉及一种图像处理方法、装置、设备、存储介质及程序产品。
发明背景
随着科学技术的发展,图像技术的应用越来越广泛,比如,将图像技术应用于游戏、动画或者广告等。
在通过图像技术展现图像之前,需要对图像中的对象进行渲染,其过程为:设置图像中对象的材质参数,根据材质参数对图像中的对象进行渲染,从而得到该图像。然而,目前,图像中对象的材质参数是通过手动方式设置的,导致渲染的效率和准确度均较低。
发明内容
本申请实施例提供一种图像处理方法、装置、设备、存储介质及程序产品,可以解决手动方式设置材质参数的值,导致渲染的效率和准确度均较低的技术问题。
一方面,本申请实施例提供一种图像处理方法,包括:
获取目标对象的待优化材质参数,并获取所述待优化材质参数的初始值;
获取所述目标对象的拍摄图像,并确定所述拍摄图像的拍摄参数;
根据所述初始值和所述拍摄参数,对所述目标对象进行渲染,得到所述目标对象的渲染图像;
根据所述渲染图像和所述拍摄图像,确定所述渲染图像和所述拍摄图像之间的差异信息;及,
根据所述差异信息,对所述初始值进行调整,得到所述待优化材质参数的目标值,以根据所述目标值对所述目标对象进行渲染,得到所述目标对象的渲染后的图像。
另一方面,本申请实施例还提供一种图像处理装置,包括:
参数获取模块,用于获取目标对象的待优化材质参数,并获取所述待优化材质参数的初始值;
图像获取模块,用于获取所述目标对象的拍摄图像,并确定所述拍摄图像的拍摄参数;
对象渲染模块,用于根据所述初始值和所述拍摄参数,对所述目标对象进行渲染,得到所述目标对象的渲染图像;
信息确定模块,用于根据所述渲染图像和所述拍摄图像,确定所述渲染图像和所述拍摄图像之间的差异信息;及,
参数优化模块,用于根据所述差异信息,对所述初始值进行调整,得到所述待优化材质参数的目标值,以根据所述目标值对所述目标对象进行渲染,得到所述目标对象的渲染后的图像。
另一方面,本申请实施例还提供一种电子设备,包括处理器和存储器,上述存储器存储有计算机程序,上述处理器用于运行上述存储器内的计算机程序实现本申请实施例提供的图像处理方法。
另一方面,本申请实施例还提供一种计算机可读存储介质,上述计算机可读存储介质存储有计算机程序,上述计算机程序适于处理器进行加载,以执行本申请实施例所提供的任一种图像处理方法。
另一方面,本申请实施例还提供一种计算机程序产品,包括计算机程序,所述计算机程序被处理器执行时实现本申请实施例所提供的任一种图像处理方法。
附图简要说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本申请实施例提供的图像处理过程的场景示意图;
图2是本申请实施例提供的图像处理方法的流程示意图;
图3是本申请实施例提供的另一种图像处理方法的流程示意图;
图4是本申请实施例提供的材质参数的设置过程的示意图;
图5是本申请实施例提供的另一种材质参数的设置过程的示意图;
图6是本申请实施例提供的材质参数优化方法的示意图;
图7是本申请实施例提供的图像处理装置的结构示意图;
图8是本申请实施例提供的电子设备的结构示意图。
实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
本申请实施例提供一种图像处理方法、装置、设备、存储介质及程序产品,其中,设备可以为电子设备,存储介质可以为计算机存储介质,程序产品可以为计算机程序产品。该图像处理装置可以集成在电子设备中,该电子设备可以是服务器,也可以是终端等设备。
其中,服务器可以是独立的物理服务器,也可以是多个物理服务器构成的服务器集群或者分布式系统,还可以是提供云服务、云数据库、云计算、云函数、云存储、网络服务、云通信、中间件服务、域名服务、安全服务、网络加速服务(Content Delivery Network,CDN)、以及大数据和人工智能平台等基础云计算服务的云服务器。
并且,多个服务器可组成为一区块链,而服务器为区块链上的节点。
终端可以是智能手机、平板电脑、笔记本电脑、台式计算机、智能音箱、智能手表、虚拟现实设备(Virtual Reality,VR)以及增强现实设备(Augmented Reality,AR)等,但并不局限于此。终端以及服务器可以通过有线或无线通信方式进行直接或间接地连接,本申请在此不做限制。
例如,图1是本申请实施例提供的图像处理过程的场景示意图,包括终端110和服务器120。如图1所示,服务器120获取目标对象的待优化材质参数,并对待优化材质参数进行初始化处理,得到待优化材质参数的初始值,获取目标对象的拍摄图像,并确定拍摄图像的拍摄参数,根据初始值和拍摄参数,对目标对象进行渲染,得到目标对象的渲染图像,根据渲染图像和拍摄图像,确定渲染图像和拍摄图像之间的差异信息,根据差异信息,对初始值进行优化(具体为调整),得到待优化材质参数的目标值。终端110发送目标对象的参数获取请求至服务器120,服务器120根据参数请求将待优化材质参数的目标值返回至终端110,终端110根据目标值对目标对象进行渲染,得到目标对象的渲染后的图像。
本申请实施例的图像处理方法,可以应用于各种对对象进行渲染的场景,比如,本申请实施例的图像处理方法,可以用于对游戏中的对象进行渲染,游戏中的对象比如可以为游戏道具、游戏角色或者游戏背景等,又比如,本申请实施例的图像处理方法,也可以用于动画中的对象进行渲染,动画中的对象可以为动画中人物、动物或者景点等。
另外,本申请实施例中的“多个”指两个或两个以上。本申请实施例中的“第一”和“第二”等用于区分描述,而不能理解为暗示相对重要性。
以下分别进行详细说明。需要说明的是,以下实施例的描述顺序不作为对实施例优选顺序的限定。
下面对本申请涉及到的名称进行解释说明。
虚幻引擎(Unreal Engine),指由编写各种游戏所需的各种工具组成的框架,游戏开发者可以直接调用虚幻引擎中各个工具,从而快速地做出游戏应用程序而不用从零开始。
渲染管线(Render Pipeline):指将数据从3D场景转换成2D图像,最终在屏幕上显示出来的总过程,渲染管线可以包括应用阶段(Application Stage)、几何阶段(Geometry Processing)以及光栅化阶段(Rasterization Stage)。其中,应用阶段指获取场景数据的过程,场景数据包括顶点三维坐标、光源、摄像机位置、视锥体、以及每个对象的材质等。几何阶段包括对场景数据进行计算过程,计算过程包括坐标变化、顶点着色、投影变化以及裁剪等过程,光栅化阶段包括将3D连续的对象转换为离散屏幕像素点的过程以及根据光照和材质参数确定屏幕像素点的颜色等过程。
本申请实施例可以将目标对象的待优化材质参数的初始值和拍摄参数输入至渲染管线中进行渲染, 从而得到目标对象的渲染图像。
在本实施例中,将从图像处理装置的角度进行描述,为了方便对本申请的图像处理方法进行说明,以下将以图像处理装置集成在终端中进行详细说明,即以终端作为执行主体进行详细说明。
请参阅图2,图2是本申请一实施例提供的图像处理方法的流程示意图。该图像处理方法由电子设备执行,例如由图1中的终端110执行、或者由服务器120执行、或者二者交互来完成。该图像处理方法可以包括:
S201、获取目标对象的待优化材质参数,并获取待优化材质参数的初始值。
目标对象可以指现实中存在的对象,其可以是人物,也可以是宠物,或者,也可以是植物,或者,也可以是各种产品,比如,目标对象可以为车漆或者石头等。
目标对象在渲染器中可以以模型的形式存在,模型指由点、线或面组成的、且没有颜色和纹理等信息的图形。
当目标对象为游戏对应的玩家时,对目标对象进行渲染之后,可以得到游戏玩家在游戏中对应的游戏角色。
材质参数指表明目标对象的可视属性的参数,其包括颜色、纹理、光滑度、透明度、反射率、折射率以及中发光度的至少一种。应理解,不同目标对象的材质参数可能不相同,也可能相同。
比如,当目标对象为车漆时,目标对象的材质参数可以指车漆的透明度和颜色。又比如,当目标对象为石头时,目标对象的材质参数可以指石头的光滑度以及颜色等。
材质参数可以通过两个类型表示,一种是通过贴图的方式,表示目标对象的材质参数,另外一种是通过具体的数值,表示目标对象的材质参数。
在本申请实施例中,当材质参数通过具体的数值表示时,待优化材质参数的初始值可以为该数值的初始值。当材质参数指贴图时,待优化材质参数的初始值可以为初始贴图。
目标对象的待优化材质参数可以指目标对象的材质参数中未被优化过的材质参数。
终端在对待优化材质参数优化之前,可以显示设置界面,然后响应于用户对设置界面的触发操作,获取到目标对象的待优化材质参数,即初始值可以由用户手动设置。
或者,终端也可以在获取到待优化材质参数之后,按照预设初始化算法,对待优化材质参数进行初始化,从而得到待优化材质参数的初始值,此时,终端自动得到待优化材质参数的初始值,无需用户进行手动设置。
预设初始化算法,可以根据实际情况进行选择,比如,可以采用随机初始化算法或标准初始化算法作为本申请实施例中的预设初始化算法,本实施例在此不做限定。
终端可以在获取目标对象的待优化材质参数时,获取待优化材质参数的初始值,或者,终端也可以在获取到目标对象的待优化材质参数之后,再获取待优化材质参数的初始值,本申请实施例在此不做限定。
S202、获取目标对象的拍摄图像,并确定拍摄图像的拍摄参数。
目标对象的拍摄图像可以包括至少一张,其中,可以在同一种拍摄参数下,对该目标对象进行拍摄,得到预设数量的拍摄图像,或者,也可以在一种拍摄参数下,拍摄一张拍摄图像,本实例在此不做限定。
拍摄图像的拍摄参数可以指拍摄目标对象时的参数,其可以包括目标对象的拍摄环境的参数和摄像头的参数中的至少一种。其中,目标对象的拍摄环境的参数可以包括目标对象的拍摄环境的天气,拍摄环境的天气包括雨水、光照度、光颜色或者光源位置等,摄像头的参数可以包括摄像头和目标对象之间的位置关系,位置关系包括距离和角度中的至少一种。
终端可以通过本身的摄像头获取目标对象的拍摄图像,或者,也可以通过其他终端的摄像头获取目标对象的拍摄图像,其他终端再将拍摄图像发送给终端,终端从而获取到目标对象的拍摄图像。
S203、根据初始值和拍摄参数,对目标对象进行渲染,得到目标对象的渲染图像。
渲染指按照设定好的环境、光以及材质参数,将对象对应的模型调整为图像的过程。
根据初始值和拍摄参数,对目标对象进行渲染,得到目标对象的渲染图像,具体为在拍摄参数下,确定根据初始值对目标对象进行渲染的渲染效果,然后根据渲染效果对目标对象进行渲染,从而得到目标对象的渲染图像。
比如,拍摄参数为光照度且光照度为280勒克斯,待优化材质参数为颜色材质参数,初始值可以为 (255,0,0),终端计算在280勒克斯下,确定根据(255,0,0)对目标对象进行渲染的渲染效果,然后根据渲染效果对目标对象进行渲染。
终端在获取到拍摄参数之后,可以通过渲染器(如渲染管线),根据初始值和拍摄参数对目标对象进行渲染,从而得到目标对象的渲染图像。
渲染器可以根据实际情况进行选择,比如,渲染器可以为虚幻引擎中或者Unity引擎中的渲染器,本申请实施例在此不做限定。
S204、根据渲染图像和拍摄图像,确定渲染图像和拍摄图像之间的差异信息。
终端可以确定渲染图像的图像参数值和拍摄图像的图像参数值之间的比值,然后将比值作为渲染图像和拍摄图像之间的差异信息,或者,终端也可以确定渲染图像的图像参数值和拍摄图像的图像参数值之间的差值,然后将差值作为渲染图像和拍摄图像之间的差异信息。
渲染图像的图像参数值可以包括渲染图像的所有图像参数值,拍摄图像的图像参数值可以包括拍摄图像的所有图像参数值,或者,渲染图像的图像参数值也可以只包括待优化材质参数对应的渲染值,拍摄图像的图像参数值可以包括待优化材质参数对应的真实值,本实施例在此不做限定。
当渲染图像的图像参数值包括待优化材质参数对应的渲染值,拍摄图像的图像参数值包括待优化材质参数对应的真实值时,根据渲染图像和拍摄图像,确定渲染图像和拍摄图像之间的差异信息,包括:
根据渲染图像,确定待优化材质参数在渲染图像中对应的渲染值;
根据拍摄图像,确定待优化材质参数在拍摄图像中对应的真实值;
根据渲染值和真实值,确定渲染图像和拍摄图像之间的差异信息。
比如,待优化材质参数指颜色材质参数,渲染图像的颜色值为A,拍摄图像的颜色值为B,将A和B之间的差值,作为渲染图像和拍摄图像之间的差异信息。
需要说明的是,一些待优化材质参数由至少两个子材质参数组成,比如,颜色材质参数(RGB)包括R参数、G参数以及B参数,则渲染图像和拍摄图像之间的差异信息可以根据各个子材质参数的子差异信息确定。此时,渲染值可以包括每个子材质参数的渲染值,真实值包括每个子材质参数的真实值,根据渲染值和真实值,确定渲染图像和拍摄图像之间的差异信息,包括:
确定子材质参数的真实值和子材质参数的渲染值之间的子差异信息;
根据各个子差异信息,确定渲染图像和拍摄图像之间的差异信息。
终端在得到各个子材质参数的子差异信息之后,可以直接将各个子材质参数的子差异信息相加,从而得到渲染图像和拍摄图像之间的差异信息。
或者,当子差异信息为子材质参数的真实值和子材质参数的渲染值之间的差值时,终端可以先计算各个子差异信息的平方,然后再将各个子差异信息的平方相加,从而得到渲染图像和拍摄图像之间的差异信息,此时,可以理解为根据各个子材质参数之间的欧式距离,确定渲染图像和拍摄图像之间的差异信息。
在一些实施例中,当待优化材质参数为颜色材质参数时,为了使得得到的目标值更加准确,根据渲染图像和拍摄图像,确定渲染图像和拍摄图像之间的差异信息,包括:
对渲染图像进行颜色空间转换,得到转换后的渲染图像;
对拍摄图像进行颜色空间转换,得到转换后的拍摄图像;
根据转换后的渲染图像和转换后的拍摄图像,确定渲染图像和拍摄图像之间的差异信息。
转换的颜色空间可以根据实际情况进行选择,比如,可以将渲染图像的颜色空间转换为Lab颜色空间或HSV颜色空间,本实施例在此不做限定。
通常,采用RGB颜色空间对渲染图像的颜色信息和拍摄图像的颜色信息进行描述。然而,采用RGB颜色空间描述的颜色信息的计算较为复杂,且用户对RGB颜色空间描述的颜色信息的敏感度较低,导致根据差异信息得到的目标值的准确度较低,因此,本申请实施例中,先对渲染图像进行颜色空间转换,得到转换后的渲染图像,对拍摄图像进行颜色空间转换,得到转换后的拍摄图像,然后再根据转换后的渲染图像和转换后的拍摄图像,确定渲染图像和拍摄图像之间的差异信息,从而减少确定差异信息的过程中的计算量以及提高根据差异信息得到的目标值的准确度。
S205、根据差异信息,对初始值进行调整,得到待优化材质参数的目标值,以根据目标值对目标对象进行渲染,得到目标对象的渲染后的图像。
如果差异信息不满足预设条件,则根据差异信息,对初始值进行优化,从而得到待优化材质参数的目标值,如果差异信息满足预设条件,则将初始值作为待优化材质参数的目标值。
当拍摄图像包括多张时,渲染图像也包括多张,此时,可以将每张渲染图像和拍摄图像之间的差异信息进行相加,得到总差异信息,并根据总差异信息对初始值进行调整,得到待优化材质参数的目标值,或者根据总差异信息的平均值,对初始值进行调整,得到待优化材质参数的目标值。
相关技术中,通常是工作人员对材质参数的取值进行手动设置,工作人员在设置材质参数的取值的过程中,一般靠工作人员的主观意识判断根据材质参数渲染得到的渲染图像是否符合用户的视觉效果,如果不符合,则再次对材质参数的取值进行调整,直至根据材质参数渲染得到的渲染图像符合用户的视觉效果。
然而,通过工作人员对材质参数的取值进行手动设置,当需要设置的材质参数较多时,材质参数空间会大大增加,从而提高人工的工作量,进而降低了材质参数的取值的设置效率。
并且,靠工作人员的主观意识判断根据材质参数渲染得到的渲染图像是否符合用户的视觉效果,不同工作人员的判断结果可能不相同,从而导致材质参数的取值的准确度较低。
而在本申请实施例中,在相同的拍摄参数下,根据渲染图像与拍摄图像之间的差异信息,对待优化材质参数的初始值进行优化,实现对待优化材质参数的取值的自动设置,从而提高待优化材质参数的取值的设置效率,并且,通过根据渲染图像与拍摄图像之间的差异信息,判断待优化材质参数的初始值是否满足预设条件,使得根据待优化材质参数的目标值,对目标对象渲染后得到的图像,尽可能地接近拍摄图像,无需通过工作人员的主观意识判断,从而提高待优化材质参数的取值的准确度。
由于可以快速地确定待优化材质参数的取值以及得到的待优化材质参数的取值更加准确,因此,根据待优化材质参数的取值对目标对象进行渲染得到的渲染图像的效率和准确度更高。
在一些实施例中,为了更加准确地得到目标值,可以对初始值进行多次优化,此时,根据差异信息,对初始值进行调整,从而得到待优化材质参数的目标值,包括:
获取预设条件;
若差异信息不满足预设条件,则根据差异信息,对初始值进行调整,得到候选值;
将候选值作为待优化材质参数的初始值,并返回执行根据初始值和拍摄参数,对目标对象进行渲染的步骤;
若差异信息满足预设条件,则将初始值作为待优化材质参数的目标值。
比如,待优化材质参数的初始值设置为A1,根据A1和拍摄参数,对目标对象进行渲染,得到目标对象的渲染图像m1,根据渲染图像m1和拍摄图像,确定渲染图像m1和拍摄图像之间的差异信息d1,若d1大于预设阈值,则根据d1对A1进行调整,得到候选值A2,并根据A2和拍摄参数,对目标对象进行渲染,得到目标对象的渲染图像m2,根据渲染图像m2和拍摄图像,确定渲染图像m2和拍摄图像之间的差异信息d2,若d2小于或等于预设阈值,则将A2作为待优化材质参数的目标值。若d1小于或等于预设阈值,则将A1作为待优化材质参数的目标值。
应理解,当将候选值作为待优化材质参数的初始值时,返回执行的也可以是获取目标对象的拍摄图像,并确定拍摄图像的拍摄参数的步骤,此时,可以通过不同的拍摄图像,对待优化材质参数进行优化。
或者,由于当调整次数达到一定次数时,待优化材质参数的取值会达到收敛状态,因此,在对初始值进行多次调整的过程中,也可以根据调整次数判断是否停止优化,也即是,在得到差异信息之后,确定当前调整次数,如果当前调整次数小于预设调整次数,则根据差异信息对初始值进行调整,得到待优化材质参数的候选值,如果当前调整次数等于预设调整次数,则将初始值作为待优化材质参数的目标值。
在本申请实施例中,根据差异信息,对初始值进行多次优化,从而提高待优化材质参数的目标值的准确度。
在另一些实施例中,根据差异信息,对初始值进行调整,得到待优化材质参数的目标值,包括:
根据差异信息,确定待优化材质参数的变化量;
根据变化量,对初始值进行调整,得到待优化材质参数的目标值。
待优化材质参数的变化量可以指待优化材质参数的变化方向和变化大小中的至少一种,代表了待优化材质参数的变化趋势。其中,当变化量指待优化材质参数的变化方向时,该变化方向可以指待优化材质参数的梯度方向。
其中,待优化材质参数的变化方向可以根据差异信息对待优化材质参数进行求导得到的导数来确定,待优化材质参数的变化大小可以根据差异信息的大小来确定,比如,差异信息越大,待优化材质参数的变化幅度越大,差异信息越小,待优化材质参数的变化幅度越小。
可选地,根据变化量,对初始值进行调整,得到待优化材质参数的目标值,包括:
获取待优化材质参数对应的优化步长;
根据优化步长和变化量,对初始值进行调整,得到待优化材质参数的目标值。
在本实施例中,设置优化步长(也可以称为优化超参数),可以用于确定初始值对应的优化程度,避免错过待优化材质参数的最优值。
其中,变化量可以采用数值表示,将优化步长和变化量进行乘法融合处理,例如将二者相乘,从而得到初始值对应的优化程度,然后将优化程度和初始值进行加法融合处理,例如将二者相加,从而得到待优化材质参数的目标值。
优化步长可以是固定的,也可以根据差异信息动态地调整优化步长,本实施例在此不做限定。
应理解,当目标对象存在至少两个材质参数时,终端可以对材质参数一个一个地优化,或者,同时对材质参数进行优化。当对材质参数一个一个地优化时,可以先从目标对象的至少两个材质参数中,筛选出待优化材质参数,然后将候选材质参数的取值固定,对待优化材质参数的取值进行优化,将候选材质参数为材质参数中除了至少两个待优化材质参数之外的参数,此时,根据初始值和拍摄参数,对目标对象进行渲染,得到目标对象的渲染图像,包括:
根据初始值、候选材质参数对应的设定值以及拍摄参数,对目标对象进行渲染,得到目标对象的渲染图像。
需要说明的是,由于候选材质参数可能为已经优化过的参数,也可能为未优化过的参数,因此,候选材质参数的设定值可以指候选材质参数的初始值,也可以指候选材质参数的目标值。
比如,目标对象的材质参数为颜色材质参数和透明度材质参数,终端先将透明度材质参数作为待优化材质参数进行优化,此时,还未对颜色材质参数进行优化,因此,颜色材质参数的设定值为颜色材质参数的初始值。当对透明度材质参数的优化完成时,得到透明度材质参数的目标值,并将颜色材质参数作为待优化材质参数进行优化,此时,已经对透明度材质参数进行优化过,因此,在对颜色材质参数优化的过程中,透明度材质参数的设定值为透明度材质参数的目标值。
在另一些实施例中,从至少两个材质参数中,筛选出待优化材质参数,包括:
确定至少两个材质参数之间的关联关系;
根据关联关系,从至少两个材质参数中,筛选出待优化材质参数。
由于有些材质参数之间不会存在影响,因此,可以根据材质参数之间的关联关系,确定各个材质参数之间的优化顺序,然后根据优化顺序,从至少两个材质参数中,筛选出待优化材质参数,从而使得最终得到的目标对象的各个材质参数,取值的准确度均比较高。
其中,终端可以设置有影响的材质参数在优化顺序中排在后面,没有影响的材质参数在优化顺序中排在前面,比如,材质参数包括颜色材质参数和透明度材质参数,透明度材质参数对颜色材质参数没有影响,因此,可以先对透明度材质参数进行优化,然后再对颜色材质参数进行优化。
因为终端是一个一个地对材质参数进行优化,所以,当对待优化材质参数的优化完成时,如果还存在未优化的材质参数,则需要对未优化的材质参数进行优化,此时,在根据差异信息,对初始值进行调整,得到待优化材质参数的目标值之后,还包括:
确定材质参数的优化状态;
若材质参数的优化状态为待优化状态,则返回执行从至少两个材质参数中,筛选出待优化材质参数的步骤。
如果材质参数的优化状态为待优化状态,表明材质参数为未优化过的参数,所以,还需要对材质参数进行优化。
比如,目标对象的材质参数为颜色材质参数和透明度材质参数,终端先将透明度材质参数作为待优化材质参数进行优化,即根据透明度材质参数的初始值、拍摄参数和颜色材质参数的设定值,对目标对象进行渲染,得到渲染图像,如果渲染图像和拍摄图像之间的差异信息满足预设条件,则对透明度材质参数的优化完成,此时,还未对颜色材质参数进行优化,所以颜色材质参数的优化状态为待优化状态, 因此,将颜色材质参数作为待优化材质参数,并继续对颜色材质参数的取值进行优化。
在本申请实施例中,当目标对象的材质参数包括至少两个时,通过对材质参数一个一个地优化,从而实现对目标对象的所有材质参数的优化。
当同时对至少两个待优化材质参数进行优化时,根据渲染图像和拍摄图像,确定渲染图像和拍摄图像之间的差异信息,包括:
根据渲染图像和拍摄图像,确定每个待优化材质参数在渲染图像和拍摄图像之间的初始差异信息;
根据每个待优化材质参数对应的初始差异信息,确定渲染图像和拍摄图像之间的差异信息。
其中,终端可以根据每个待优化材质参数在渲染图像中对应的渲染值和在拍摄图像中对应的真实值,确定每个待优化材质参数在渲染图像和拍摄图像之间的初始差异信息,然后将各个初始差异信息相加,从而得到渲染图像和拍摄图像之间的差异信息。
或者,当同时对至少两个待优化材质参数进行优化时,根据渲染图像和拍摄图像,确定渲染图像和拍摄图像之间的差异信息的过程也可以为:
根据每个待优化材质参数在渲染图像中对应的渲染值,确定渲染图像的总渲染值;
根据每个待优化材质参数在拍摄图像中对应的真实值,确定拍摄图像的总真实值;
根据总渲染值和总真实值,确定渲染图像和拍摄图像之间的差异信息。
应理解,当同时对至少两个待优化材质参数进行优化时,每个材质参数对应的优化步长可以是不相同,也可以相同,本申请实施例在此不做限定。
在本实施例中,同时对目标对象的所有材质参数进行优化,减少得到每个材质参数的目标值所需的时间,从而进一步提高得到目标对象的材质参数的目标值的速度。
终端在得到待优化材质参数的目标值之后,可立即根据实际应用场景,采用目标值对目标对象进行渲染,从而得到目标对象的渲染后的图像,或者,终端也可以在预设时间间隔后,再根据实际应用场景,采用目标值对目标对象进行渲染,从而得到目标对象的渲染后的图像,本申请实施例在此不做限定。
需要说明的是,在得到目标对象的待优化材质参数的目标值之后,目标对象的待优化材质参数在不同应用场景中对应的目标值可以是相同的,但是,根据不同的应用场景,采用目标值对目标对象进行渲染后的渲染效果可能不相同。
比如,应用场景包括第一应用场景和第二应用场景,第一应用场景为月光照耀着地面,地面上存在车漆,第二应用场景为日光照耀着地面,地面上存在车漆,则根据车漆的透明度材质参数的目标值和颜色材质参数的目标值,分别对第一应用场景中的车漆进行渲染和第二应用场景中的车漆进行渲染,则车漆在第一应用场景中的渲染效果和在第二应用场景中的渲染效果不相同。
由以上可知,在本申请实施例中,获取目标对象的待优化材质参数,并对待优化材质参数进行初始化处理,得到待优化材质参数的初始值,获取目标对象的拍摄图像,并确定拍摄图像的拍摄参数,根据初始值和拍摄参数,对目标对象进行渲染,得到目标对象的渲染图像,根据渲染图像和拍摄图像,确定渲染图像和拍摄图像之间的差异信息,根据差异信息,对初始值进行调整,得到待优化材质参数的目标值,以根据目标值对目标对象进行渲染,得到目标对象的渲染后的图像,实现通过拍摄图像对待优化材质参数的初始值进行自动优化,提高对待优化材质参数进行优化的效率和准确度,从而提高根据目标值对目标对象进行渲染的效率和准确度。
根据上述实施例所描述的方法,以下将举例作进一步详细说明。
请参阅图3,图3为本申请实施例提供的图像处理方法的流程示意图。该图像处理方法由电子设备执行,例如由图1中的终端110执行。该图像处理方法的流程可以包括:
S301、终端获取目标对象的透明度材质参数和颜色材质参数,并获取透明度材质参数的初始值和颜色材质参数的初始值。
透明度材质参数(Opacity)可以为一个标量(scalar),例如采用浮点数类型,颜色材质参数(Base Color)可以为一个三维向量(vector)。终端对材质参数进行优化的过程可以包括参数设置部分和参数优化部分。
在参数设置过程中,终端可以在虚幻引擎4(UE4)中添加标量参数和向量参数。比如,虚幻引擎4的参数设置界面如图4所示(参数的取值即材质参数的初始值),终端可以在虚幻引擎4创建一个材质参数集合(Material Parameter Collection,MPC),然后在材质参数集合中设置透明度材质参数和颜色材质 参数。
终端在材质参数集合中设置好透明度材质参数的初始值和颜色材质参数的初始值之后,将材质参数集合输入至虚幻引擎4的渲染器中。
可选地,终端可以只将材质参数集合中待优化材质参数的初始值输入至虚幻引擎4的渲染器中,候选待优化材质参数可以采用默认的参数值。比如,如图5所示。当对透明度材质参数的初始值进行优化时,将透明度材质参数的初始值输入至虚幻引擎4的渲染器中,其中,图5中渲染器(Fresnel)中涉及到的数据类型包括指数(Exponenth)、基本反射小数(BaseReflectFraction)、范数(Normal)。
S302、终端确定透明度材质参数和颜色材质参数之间的关联关系,并根据关联关系,从透明度材质参数和颜色材质参数中,筛选出透明度材质参数作为待优化材质参数,颜色材质参数作为候选材质参数。
S303、终端获取目标对象的拍摄图像,并确定拍摄图像的光照度。
S304、终端根据候选材质参数的设定值、待优化材质参数的初始值和光照度,对目标对象进行渲染,得到目标对象的渲染图像。
S305、终端确定待优化材质参数在渲染图像中对应的渲染值和在拍摄图像中对应的真实值,并判断待优化材质参数是否为颜色材质参数。
S306、若待优化材质参数不是颜色材质参数,终端则确定渲染值和真实值之间的差值,并将渲染值和真实值之间的差值作为渲染图像和拍摄图像之间的差异信息。
S307、若待优化材质参数是颜色材质参数,终端则对渲染值和真实值进行颜色空间转换,得到转换后的渲染值和转换后的真实值。
For example, when the rendered value is a colour value expressed in the RGB colour space, the rendered value can be converted into a colour value expressed in the Lab colour space. In this case, the rendered value in the RGB colour space is first converted into a candidate rendered value in the XYZ colour space, and the candidate rendered value is then converted into the converted rendered value in the Lab colour space. That is, the rendered value can be substituted into the following formulas to obtain the converted rendered value:

$$\begin{bmatrix} X_i \\ Y_i \\ Z_i \end{bmatrix} = M \begin{bmatrix} R_i \\ G_i \\ B_i \end{bmatrix}$$

$$L_i = 116\, f\!\left(\frac{Y_i}{Y_n}\right) - 16,\qquad a_i = 500\left[f\!\left(\frac{X_i}{X_n}\right) - f\!\left(\frac{Y_i}{Y_n}\right)\right],\qquad b_i = 200\left[f\!\left(\frac{Y_i}{Y_n}\right) - f\!\left(\frac{Z_i}{Z_n}\right)\right]$$

where M is the conversion matrix; Xn, Yn and Zn are the white reference point under the CIE 1931 standard, with Xn = 95.047, Yn = 100.0 and Zn = 108.883; Li, ai and bi are the converted rendered value; (Xi, Yi, Zi) is the candidate rendered value; and (Ri, Gi, Bi) is the rendered value. f(·) can be expressed by the following formula:

$$f(t) = \begin{cases} t^{1/3}, & t > \left(\tfrac{6}{29}\right)^{3} \\ \tfrac{1}{3}\left(\tfrac{29}{6}\right)^{2} t + \tfrac{4}{29}, & \text{otherwise} \end{cases}$$

The real value is converted in the same way as the rendered value described above, which is not repeated here.
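As an illustrative sketch only, the conversion above can be written as follows. The conversion matrix M is assumed here to be the standard linear-sRGB-to-XYZ (D65) matrix scaled to the 0 to 100 range of the white reference point, and the input RGB value is assumed to be linear and normalized to [0, 1]; the text itself does not fix these choices.

```python
import numpy as np

# Assumed linear-sRGB -> XYZ (D65) matrix, scaled so that Y lies in [0, 100]
# to match the white reference point quoted in the text.
M = 100.0 * np.array([[0.4124564, 0.3575761, 0.1804375],
                      [0.2126729, 0.7151522, 0.0721750],
                      [0.0193339, 0.1191920, 0.9503041]])

# White reference point under the CIE 1931 standard, as given above.
XN, YN, ZN = 95.047, 100.0, 108.883

def _f(t):
    """Piecewise CIE function f(t) used by the Lab conversion."""
    delta = 6.0 / 29.0
    if t > delta ** 3:
        return t ** (1.0 / 3.0)
    return t / (3.0 * delta ** 2) + 4.0 / 29.0

def rgb_to_lab(rgb):
    """Convert a linear RGB value (3 components in [0, 1]) to Lab."""
    x, y, z = M @ np.asarray(rgb, dtype=float)   # candidate value in XYZ space
    fx, fy, fz = _f(x / XN), _f(y / YN), _f(z / ZN)
    lightness = 116.0 * fy - 16.0                # L: lightness
    a = 500.0 * (fx - fy)                        # a: red-green axis
    b = 200.0 * (fy - fz)                        # b: yellow-blue axis
    return np.array([lightness, a, b])
```

Both the rendered value and the real value taken from the captured image would be passed through rgb_to_lab before the difference information is computed.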
S308. The terminal determines the difference between the converted rendered value and the converted real value, and uses the difference between the converted rendered value and the converted real value as the difference information between the rendered image and the captured image.

Since the colour material parameter may include three sub-material parameters, the sub-difference between the converted rendered value and the converted real value of each sub-material parameter can be computed separately, and the three sub-differences are then combined to obtain the difference between the converted rendered value and the converted real value.

That is, the converted rendered value and the converted real value can be substituted into the following formulas to obtain the difference information between the rendered image and the captured image:

$$\Delta L = L_r - L_i,\qquad \Delta a = a_r - a_i,\qquad \Delta b = b_r - b_i$$

$$\Delta E = \sqrt{\Delta L^{2} + \Delta a^{2} + \Delta b^{2}}$$

where ΔL is the sub-difference between the converted real value and the converted rendered value of the lightness sub-material parameter, Δa is the sub-difference of the red-green sub-material parameter, Δb is the sub-difference of the yellow-blue sub-material parameter, ΔE is the difference between the converted rendered value and the converted real value, and Lr, ar and br denote the converted real value.
S309. If the difference information is greater than the preset threshold, the terminal solves for the gradient of the difference information with respect to the material parameter to be optimized, and obtains the optimization step size corresponding to the material parameter to be optimized.

S3010. The terminal adjusts the initial value of the material parameter to be optimized according to the gradient of the material parameter to be optimized and the corresponding optimization step size, to obtain a candidate value.

The terminal may perform multiplicative fusion of the gradient of the material parameter to be optimized and the corresponding optimization step size to obtain the optimization degree of the material parameter, and then perform additive fusion of the optimization degree and the initial value to obtain the optimized candidate value of the material parameter to be optimized.

For example, when the material parameter to be optimized is the colour material parameter, the difference information can be substituted into the following gradient-descent update to obtain the optimized candidate value of the material parameter to be optimized:

$$L_{k+1} = L_k - \alpha\,\frac{\partial \Delta E}{\partial L_k},\qquad a_{k+1} = a_k - \beta\,\frac{\partial \Delta E}{\partial a_k},\qquad b_{k+1} = b_k - \gamma\,\frac{\partial \Delta E}{\partial b_k}$$

where Lk+1, ak+1 and bk+1 are the optimized candidate values of the material parameter to be optimized; Lk, ak and bk are the initial values of the material parameter to be optimized; ∂ΔE/∂Lk is the gradient of the difference information with respect to the lightness sub-material parameter and α is the optimization step size of the lightness sub-material parameter; ∂ΔE/∂ak is the gradient with respect to the red-green sub-material parameter and β is the optimization step size of the red-green sub-material parameter; ∂ΔE/∂bk is the gradient with respect to the yellow-blue sub-material parameter and γ is the optimization step size of the yellow-blue sub-material parameter; and k is the number of adjustments.
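The loop formed by steps S303 to S3011 can be pictured with the following minimal sketch. It is illustrative only and makes several assumptions that the text leaves open: the UE4 renderer is wrapped in a callable render(params, shooting_params) that returns the rendered colour of the target object, the gradients are approximated by finite differences (the text only requires that the gradient of the difference information be solved), and the step sizes, threshold and iteration limit are placeholder values. It reuses rgb_to_lab from the sketch above.

```python
import numpy as np

def delta_e(lab_rendered, lab_real):
    """Difference information between the converted rendered and real values."""
    d = np.asarray(lab_real, dtype=float) - np.asarray(lab_rendered, dtype=float)
    return float(np.sqrt(np.sum(d ** 2)))

def optimize_color_parameter(init_rgb, real_rgb, render, shooting_params,
                             steps=(0.01, 0.01, 0.01), threshold=0.5,
                             max_iters=200, eps=1e-3):
    """Iteratively adjust the colour material parameter (steps S303 to S3012).

    render -- callable(params, shooting_params) returning the rendered RGB
              colour of the target object (e.g. queried from the UE4 pipeline).
    """
    params = np.asarray(init_rgb, dtype=float)
    lab_real = rgb_to_lab(real_rgb)              # converted real value

    for _ in range(max_iters):
        lab_rendered = rgb_to_lab(render(params, shooting_params))
        diff = delta_e(lab_rendered, lab_real)
        if diff <= threshold:                    # preset condition satisfied
            break

        # Finite-difference approximation of the gradient of the difference
        # information with respect to each channel of the parameter.
        grad = np.zeros_like(params)
        for i in range(params.size):
            perturbed = params.copy()
            perturbed[i] += eps
            lab_pert = rgb_to_lab(render(perturbed, shooting_params))
            grad[i] = (delta_e(lab_pert, lab_real) - diff) / eps

        # Gradient-descent update: candidate value = current value - step * gradient.
        params = params - np.asarray(steps, dtype=float) * grad

    return params                                # target value of the parameter
```

For a parameter that is not a colour parameter, such as opacity, the same loop can be reused with a single scalar parameter and with the plain difference between the rendered value and the real value used in place of the Lab-space ΔE.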
S3011、终端将候选值作为待优化材质参数的初始值,并返回执行步骤S303。
S3012、若差异信息小于或等于预设阈值,则将初始值作为待优化材质参数的目标值。
本申请实施例中差异信息也可以理解为损失函数值,即将渲染值和真实值代入损失函数中进行计算,从而得到损失函数值,并将损失函数值作为差异信息。
比如,当待优化材质参数为颜色材质参数时,本申请实施例的材质参数的优化过程可以如图6所示,终端在得到待优化材质参数的初始值后,将初始值输入至虚幻引擎中对目标对象进行渲染,得到渲染图像,然后分别对渲染图像的渲染值和拍摄图像的真实值进行颜色空间转换,得到转换后的渲染值和转换后的真实值,接着将转换后的渲染值和转换后的真实值代入损失函数中,计算得到损失函数值,并根据损失函数值计算待优化材质参数的梯度,根据梯度下降优化待优化材质参数的初始值,得到候选值,最 后继续对候选值进行优化,直至损失函数值满足预设条件。
S3013、终端确定颜色材质参数的优化状态。
S3014、若颜色材质参数的优化状态为待优化状态,终端则将颜色材质参数作为待优化材质参数,将透明度材质参数作为候选材质参数,并返回执行步骤S303。
S3015、若颜色材质参数的优化状态为已优化状态,终端则可以根据颜色材质参数的目标值和透明度材质参数的目标值,对目标对象进行渲染,得到目标对象的渲染图像。
在本申请实施例中,计算在相同光照环境下渲染图像与拍摄图像之间的差异,再计算差异相对于待优化材质参数的导数,并进行梯度下降优化,在进行若干轮迭代计算之后,可以得到最接近拍摄图像的渲染图像,从而可以得到最接近拍摄图像时材质参数的目标值。
下面对本申请实施例的效果进行说明。
本申请实施例中,在光照度为280勒克斯、210勒克斯、160勒克斯、110勒克斯以及65勒克斯时,对车漆进行拍摄,得到车漆的拍摄图像,然后在280勒克斯、210勒克斯、160勒克斯、110勒克斯以及65勒克斯下,根据拍摄图像对透明度材质参数进行优化,当透明度材质参数的初始值为0.2、渲染图像和拍摄图像之间的差异信息为4.243824并小于预设阈值时,将初始值0.2作为透明度材质参数的目标值。
同理地,终端在得到透明度材质参数的目标值之后,在280勒克斯、210勒克斯、160勒克斯、110勒克斯、65勒克斯以及透明度材质参数的目标值为0.2下,根据拍摄图像对颜色材质参数进行优化,当颜色材质参数的初始值为0.86,渲染图像和拍摄图像之间的差异信息为3.043818并小于预设阈值时,将初始值0.86作为颜色材质参数的目标值。
为便于更好的实施本申请实施例提供的图像处理方法,本申请实施例还提供一种基于上述图像处理方法的装置。其中名词的含义与上述图像处理方法中相同,具体实现细节可以参考方法实施例中的说明。
例如,如图7所示,该图像处理装置可以包括:
参数获取模块701,用于获取目标对象的待优化材质参数,并获取待优化材质参数的初始值;
图像获取模块702,用于获取目标对象的拍摄图像,并确定拍摄图像的拍摄参数;
对象渲染模块703,用于根据初始值和拍摄参数,对目标对象进行渲染,得到目标对象的渲染图像;
信息确定模块704,用于根据渲染图像和拍摄图像,确定渲染图像和拍摄图像之间的差异信息;
参数优化模块705,用于根据差异信息,对初始值进行调整,得到待优化材质参数的目标值,以根据目标值对目标对象进行渲染,得到目标对象的渲染后的图像。
可选地,信息确定模块704具体用于执行:
根据渲染图像,确定待优化材质参数在渲染图像中对应的渲染值;
根据拍摄图像,确定待优化材质参数在拍摄图像中对应的真实值;
根据渲染值和真实值,确定渲染图像和拍摄图像之间的差异信息。
可选地,待优化材质参数包括至少两个子材质参数,渲染值包括至少两个子材质参数中每个子材质参数的渲染值,真实值包括每个子材质参数的真实值。
相应地,信息确定模块704具体用于执行:
确定每个子材质参数的真实值和该子材质参数的渲染值之间的子差异信息;
根据各个子差异信息,确定渲染图像和拍摄图像之间的差异信息。
可选地,参数优化模块705具体用于执行:
获取预设条件;
若差异信息不满足预设条件,则根据差异信息,对初始值进行调整,得到候选值;
将候选值作为待优化材质参数的初始值,并返回执行根据初始值和拍摄参数,对目标对象进行渲染的步骤;
若差异信息满足预设条件,则将初始值作为待优化材质参数的目标值。
可选地,参数优化模块705具体用于执行:
根据差异信息,确定待优化材质参数的变化量;
根据变化量,对初始值进行调整,得到待优化材质参数的目标值。
可选地,参数优化模块705具体用于执行:
获取待优化材质参数对应的优化步长;
根据优化步长和变化量,确定初始值对应的优化程度;
根据优化程度,对初始值进行调整,得到目标值。
可选地,待优化材质参数包括颜色材质参数。
相应地,信息确定模块704具体用于执行:
对渲染图像进行颜色空间转换,得到转换后的渲染图像;
对拍摄图像进行颜色空间转换,得到转换后的拍摄图像;
根据转换后的渲染图像和转换后的拍摄图像,确定渲染图像和拍摄图像之间的差异信息。
可选地,参数获取模块701具体用于执行:
获取目标对象的至少两个材质参数;
从至少两个材质参数中,筛选出待优化材质参数。
相应地,对象渲染模块703具体用于执行:
根据初始值、候选材质参数对应的设定值以及拍摄参数,对目标对象进行渲染,得到目标对象的渲染图像,其中,候选材质参数为至少两个材质参数中除了待优化材质参数之外的参数。
可选地,参数获取模块701具体用于执行:
确定至少两个材质参数之间的关联关系;
根据关联关系,从至少两个材质参数中,筛选出待优化材质参数。
可选地,参数获取模块701还用于执行:
确定材质参数的优化状态;
若材质参数的优化状态为待优化状态,则返回执行从至少两个材质参数中,筛选出待优化材质参数的步骤。
可选地,当对至少两个所述待优化材质参数进行优化时,信息确定模块704具体用于执行:
根据渲染图像和拍摄图像,确定每个待优化材质参数在渲染图像和拍摄图像之间的初始差异信息;
根据每个待优化材质参数对应的初始差异信息,确定渲染图像和拍摄图像之间的差异信息。
具体实施时,以上各个模块可以作为独立的实体来实现,也可以进行任意组合,作为同一或若干个实体来实现,以上各个模块的具体实施方式以及对应的有益效果可参见前面的方法实施例,在此不再赘述。
本申请实施例还提供一种电子设备,该电子设备可以是服务器或终端等,如图8所示,其示出了本申请实施例所涉及的电子设备的结构示意图。其中,该电子设备可以包括一个或者一个以上处理核心的处理器801、一个或一个以上计算机可读存储介质的存储器802、电源803和输入单元804等部件。本领域技术人员可以理解,图8中示出的电子设备结构并不构成对电子设备的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
处理器801是该电子设备的控制中心,利用各种接口和线路连接整个电子设备的各个部分,通过运行或执行存储在存储器802内的计算机程序和/或模块,以及调用存储在存储器802内的数据,执行电子设备的各种功能和处理数据。可选的,处理器801可包括一个或多个处理核心;优选的,处理器801可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器801中。
存储器802可用于存储计算机程序以及模块,处理器801通过运行存储在存储器802的计算机程序以及模块,从而执行各种功能应用以及数据处理。存储器802可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的计算机程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据电子设备的使用所创建的数据等。此外,存储器802可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。相应地,存储器802还可以包括存储器控制器,以提供处理器801对存储器802的访问。
电子设备还包括给各个部件供电的电源803,优选的,电源803可以通过电源管理系统与处理器801逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。电源803还可以包括一个或一个以上的直流或交流电源、再充电系统、电源故障检测电路、电源转换器或者逆变器、电源状态指示器等任意组件。
该电子设备还可包括输入单元804,该输入单元804可用于接收输入的数字或字符信息,以及产生与用户设置以及功能控制有关的键盘、鼠标、操作杆、光学或者轨迹球信号输入。
尽管未示出,电子设备还可以包括显示单元等,在此不再赘述。具体在本实施例中,电子设备中的处理器801会按照如下的指令,将一个或一个以上的计算机程序的进程对应的可执行文件加载到存储器802中,并由处理器801来运行存储在存储器802中的计算机程序,从而实现各种功能,比如:
获取目标对象的待优化材质参数,并获取待优化材质参数的初始值;
获取目标对象的拍摄图像,并确定拍摄图像的拍摄参数;
根据初始值和拍摄参数,对目标对象进行渲染,得到目标对象的渲染图像;
根据渲染图像和拍摄图像,确定渲染图像和拍摄图像之间的差异信息;
根据差异信息,对初始值进行调整,得到待优化材质参数的目标值,以根据目标值对目标对象进行渲染,得到目标对象的渲染后的图像。
以上各个操作的具体实施方式以及对应的有益效果可参见上文对图像处理方法的详细描述,在此不作赘述。
本领域普通技术人员可以理解,上述实施例的各种方法中的全部或部分步骤可以通过计算机程序来完成,或通过计算机程序控制相关的硬件来完成,该计算机程序可以存储于一计算机可读存储介质中,并由处理器进行加载和执行。
为此,本申请实施例提供一种计算机可读存储介质,其中存储有计算机程序,该计算机程序能够被处理器进行加载,以执行本申请实施例所提供的任一种图像处理方法中的步骤。例如,该计算机程序可以执行如下步骤:
获取目标对象的待优化材质参数,并获取待优化材质参数的初始值;
获取目标对象的拍摄图像,并确定拍摄图像的拍摄参数;
根据初始值和拍摄参数,对目标对象进行渲染,得到目标对象的渲染图像;
根据渲染图像和拍摄图像,确定渲染图像和拍摄图像之间的差异信息;
根据差异信息,对初始值进行调整,得到待优化材质参数的目标值,以根据目标值对目标对象进行渲染,得到目标对象的渲染后的图像。
以上各个操作的具体实施方式以及对应的有益效果可参见前面的实施例,在此不再赘述。
其中,该计算机可读存储介质可以包括:只读存储器(ROM,Read Only Memory)、随机存取记忆体(RAM,Random Access Memory)、磁盘或光盘等。
由于该计算机可读存储介质中所存储的计算机程序,可以执行本申请实施例所提供的任一种图像处理方法中的步骤,因此,可以实现本申请实施例所提供的任一种图像处理方法所能实现的有益效果,详见前面的实施例,在此不再赘述。
其中,根据本申请的一个方面,提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程序包括计算机指令,该计算机指令存储在计算机可读存储介质中。计算机设备的处理器从计算机可读存储介质读取该计算机指令,处理器执行该计算机指令,使得该计算机设备执行上述图像处理方法。
以上对本申请实施例所提供的一种图像处理方法、装置、设备、存储介质及程序产品进行了详细介绍,本文中应用了具体个例对本申请的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本申请的方法及其核心思想;同时,对于本领域的技术人员,依据本申请的思想,在具体实施方式及应用范围上均会有改变之处,综上所述,本说明书内容不应理解为对本申请的限制。

Claims (20)

  1. 一种图像处理方法,由电子设备执行,包括:
    获取目标对象的待优化材质参数,并获取所述待优化材质参数的初始值;
    获取所述目标对象的拍摄图像,并确定所述拍摄图像的拍摄参数;
    根据所述初始值和所述拍摄参数,对所述目标对象进行渲染,得到所述目标对象的渲染图像;
    根据所述渲染图像和所述拍摄图像,确定所述渲染图像和所述拍摄图像之间的差异信息;及,
    根据所述差异信息,对所述初始值进行调整,得到所述待优化材质参数的目标值,以根据所述目标值对所述目标对象进行渲染,得到所述目标对象的渲染后的图像。
  2. 根据权利要求1所述的图像处理方法,其中,所述根据所述渲染图像和所述拍摄图像,确定所述渲染图像和所述拍摄图像之间的差异信息,包括:
    根据所述渲染图像,确定所述待优化材质参数在所述渲染图像中对应的渲染值;
    根据所述拍摄图像,确定所述待优化材质参数在所述拍摄图像中对应的真实值;
    根据所述渲染值和所述真实值,确定所述差异信息。
  3. 根据权利要求2所述的图像处理方法,其中,所述待优化材质参数包括至少两个子材质参数,所述渲染值包括所述至少两个子材质参数中每个子材质参数的渲染值,所述真实值包括每个子材质参数的真实值;
    所述根据所述渲染值和所述真实值,确定所述差异信息,包括:
    确定每个子材质参数的真实值和该子材质参数的渲染值之间的子差异信息;
    根据各个所述子差异信息,确定所述差异信息。
  4. 根据权利要求1所述的图像处理方法,其中,所述根据所述差异信息,对所述初始值进行调整,得到所述待优化材质参数的目标值,包括:
    获取预设条件;
    若所述差异信息不满足所述预设条件,则根据所述差异信息,对所述初始值进行调整,得到候选值;
    将所述候选值作为所述待优化材质参数的初始值,并返回执行根据所述初始值和所述拍摄参数,对所述目标对象进行渲染的步骤;
    若所述差异信息满足所述预设条件,则将所述初始值作为所述目标值。
  5. 根据权利要求1所述的图像处理方法,其中,所述根据所述差异信息,对所述初始值进行调整,得到所述待优化材质参数的目标值,包括:
    根据所述差异信息,确定所述待优化材质参数的变化量;
    根据所述变化量,对所述初始值进行调整,得到所述目标值。
  6. 根据权利要求5所述的图像处理方法,其中,所述根据所述变化量,对所述初始值进行调整,得到所述目标值,包括:
    获取所述待优化材质参数对应的优化步长;
    根据所述优化步长和所述变化量,对所述初始值进行调整,得到所述目标值。
  7. 根据权利要求1所述的图像处理方法,其中,所述待优化材质参数包括颜色材质参数;
    所述根据所述渲染图像和所述拍摄图像,确定所述渲染图像和所述拍摄图像之间的差异信息,包括:
    对所述渲染图像进行颜色空间转换,得到转换后的渲染图像;
    对所述拍摄图像进行颜色空间转换,得到转换后的拍摄图像;
    根据所述转换后的渲染图像和所述转换后的拍摄图像,确定所述差异信息。
  8. 根据权利要求1-7中任一项所述的图像处理方法,其中,所述获取目标对象的待优化材质参数,包括:
    获取所述目标对象的至少两个材质参数;
    从所述至少两个所述材质参数中,筛选出所述待优化材质参数;
    所述根据所述初始值和所述拍摄参数,对所述目标对象进行渲染,得到所述目标对象的渲染图像,包括:
    根据所述初始值、候选材质参数对应的设定值以及所述拍摄参数,对所述目标对象进行渲染,得到所述目标对象的渲染图像,其中,所述候选材质参数为所述至少两个材质参数中除了待优化材质参数之外的参数。
  9. 根据权利要求8所述的图像处理方法,其中,所述从所述至少两个所述材质参数中,筛选出所述待优化材质参数,包括:
    确定所述至少两个材质参数之间的关联关系;
    根据所述关联关系,从所述至少两个材质参数中,筛选出所述待优化材质参数。
  10. 根据权利要求8所述的图像处理方法,其中,在所述根据所述差异信息,对所述初始值进行调整,得到所述待优化材质参数的目标值之后,还包括:
    确定所述材质参数的优化状态;
    若所述材质参数的优化状态为待优化状态,则返回执行从所述至少两个材质参数中,筛选出所述待优化材质参数的步骤。
  11. 根据权利要求1所述的图像处理方法,其中,当对至少两个所述待优化材质参数进行优化时,所述根据所述渲染图像和所述拍摄图像,确定所述渲染图像和所述拍摄图像之间的差异信息,包括:
    根据所述渲染图像和所述拍摄图像,确定每个所述待优化材质参数在所述渲染图像和所述拍摄图像之间的初始差异信息;
    根据每个所述待优化材质参数对应的初始差异信息,确定所述差异信息。
  12. 一种图像处理装置,包括:
    参数获取模块,用于获取目标对象的待优化材质参数,并获取所述待优化材质参数的初始值;
    图像获取模块,用于获取所述目标对象的拍摄图像,并确定所述拍摄图像的拍摄参数;
    对象渲染模块,用于根据所述初始值和所述拍摄参数,对所述目标对象进行渲染,得到所述目标对象的渲染图像;
    信息确定模块,用于根据所述渲染图像和所述拍摄图像,确定所述渲染图像和所述拍摄图像之间的差异信息;及,
    参数优化模块,用于根据所述差异信息,对所述初始值进行调整,得到所述待优化材质参数的目标值,以根据所述目标值对所述目标对象进行渲染,得到所述目标对象的渲染后的图像。
  13. 根据权利要求12所述的图像处理装置,其中,所述信息确定模块用于,根据所述渲染图像,确定所述待优化材质参数在所述渲染图像中对应的渲染值;根据所述拍摄图像,确定所述待优化材质参数在所述拍摄图像中对应的真实值;根据所述渲染值和所述真实值,确定所述差异信息。
  14. 根据权利要求12所述的图像处理装置,其中,所述参数优化模块用于,获取预设条件;若所述差异信息不满足所述预设条件,则根据所述差异信息,对所述初始值进行调整,得到候选值;将所述候选值作为所述待优化材质参数的初始值,并返回执行根据所述初始值和所述拍摄参数,对所述目标对象进行渲染的步骤;若所述差异信息满足所述预设条件,则将所述初始值作为所述目标值。
  15. 根据权利要求12所述的图像处理装置,其中,所述参数优化模块用于,根据所述差异信息,确定所述待优化材质参数的变化量;根据所述变化量,对所述初始值进行调整,得到所述目标值。
  16. 根据权利要求12所述的图像处理装置,其中,所述待优化材质参数包括颜色材质参数;所述信息确定模块用于,对所述渲染图像进行颜色空间转换,得到转换后的渲染图像;对所述拍摄图像进行颜色空间转换,得到转换后的拍摄图像;根据所述转换后的渲染图像和所述转换后的拍摄图像,确定所述差异信息。
  17. 根据权利要求12-16中任一项所述的图像处理装置,其中,所述参数获取模块用于,获取所述目标对象的至少两个材质参数;从所述至少两个所述材质参数中,筛选出所述待优化材质参数;
    所述对象渲染模块用于,根据所述初始值、候选材质参数对应的设定值以及所述拍摄参数,对所述目标对象进行渲染,得到所述目标对象的渲染图像,其中,所述候选材质参数为所述至少两个材质参数中除了待优化材质参数之外的参数。
  18. 一种电子设备,包括处理器和存储器,所述存储器存储有计算机程序,所述处理器用于运行所述存储器内的计算机程序,以执行权利要求1至11中任一项所述的图像处理方法。
  19. 一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序适于 处理器进行加载,以执行权利要求1至11中任一项所述的图像处理方法。
  20. 一种计算机程序产品,所述计算机程序产品存储有计算机程序,所述计算机程序适于处理器进行加载,以执行权利要求1至11中任一项所述的图像处理方法。
PCT/CN2023/123514 2022-11-25 2023-10-09 图像处理方法、装置、设备、存储介质及程序产品 WO2024109357A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211493839.4A CN118096966A (zh) 2022-11-25 2022-11-25 图像处理方法、装置、设备、存储介质及程序产品
CN202211493839.4 2022-11-25

Publications (1)

Publication Number Publication Date
WO2024109357A1 true WO2024109357A1 (zh) 2024-05-30

Family

ID=91159060

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/123514 WO2024109357A1 (zh) 2022-11-25 2023-10-09 图像处理方法、装置、设备、存储介质及程序产品

Country Status (2)

Country Link
CN (1) CN118096966A (zh)
WO (1) WO2024109357A1 (zh)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210241519A1 (en) * 2019-03-05 2021-08-05 Facebook Technologies, Llc Inverse Path Tracing for Material and Lighting Estimation
CN110428388A (zh) * 2019-07-11 2019-11-08 阿里巴巴集团控股有限公司 一种图像数据生成方法及装置
CN113822977A (zh) * 2021-06-28 2021-12-21 腾讯科技(深圳)有限公司 图像渲染方法、装置、设备以及存储介质
CN114494494A (zh) * 2022-01-24 2022-05-13 广州繁星互娱信息科技有限公司 图像光照参数的调整方法和装置、存储介质及电子设备
CN115272548A (zh) * 2022-07-29 2022-11-01 商汤国际私人有限公司 渲染参数调整方法、装置、电子设备及存储介质
CN115375816A (zh) * 2022-09-06 2022-11-22 清华大学 可微渲染方法、装置、电子设备及存储介质

Also Published As

Publication number Publication date
CN118096966A (zh) 2024-05-28

Similar Documents

Publication Publication Date Title
CN112116692B (zh) 模型渲染方法、装置、设备
US9183654B2 (en) Live editing and integrated control of image-based lighting of 3D models
US20200344411A1 (en) Context-aware image filtering
WO2021228031A1 (zh) 渲染方法、设备以及系统
US20170332009A1 (en) Devices, systems, and methods for a virtual reality camera simulator
US20240087219A1 (en) Method and apparatus for generating lighting image, device, and medium
CN110333924A (zh) 一种图像渐变调整方法、装置、设备及存储介质
CN111476910B (zh) 智能建筑bim的3d模型显示方法、系统、介质及显示终端
CN112370783A (zh) 虚拟对象渲染方法、装置、计算机设备和存储介质
CN111080746A (zh) 图像处理方法、装置、电子设备和存储介质
CN116758208A (zh) 全局光照渲染方法、装置、存储介质及电子设备
CN115100337A (zh) 一种基于卷积神经网络的全身人像视频重照明方法和装置
CN114494024B (zh) 图像渲染方法、装置、设备及存储介质
WO2024109357A1 (zh) 图像处理方法、装置、设备、存储介质及程序产品
CN114359021A (zh) 渲染画面的处理方法、装置、电子设备及介质
CN114359458A (zh) 一种图像渲染方法、装置、设备、存储介质及程序产品
CN108932703A (zh) 图片处理方法、图片处理装置及终端设备
US12131482B2 (en) Learning apparatus, foreground region estimation apparatus, learning method, foreground region estimation method, and program
CN116912387A (zh) 纹理贴图的处理方法及装置、电子设备、存储介质
CN114764821A (zh) 移动物体检测方法、装置、电子设备和存储介质
CN108765574A (zh) 3d场景拟真方法及系统和计算机可读存储介质
CN114367105A (zh) 模型着色方法、装置、设备、介质和程序产品
CN114245907A (zh) 自动曝光的光线追踪
US11380048B2 (en) Method and system for determining a spectral representation of a color
CN110223363A (zh) 图像生成方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23893449

Country of ref document: EP

Kind code of ref document: A1