Nothing Special   »   [go: up one dir, main page]

CN114463528B - Target visual angle playing method and device - Google Patents

Target visual angle playing method and device Download PDF

Info

Publication number
CN114463528B
CN114463528B CN202210134505.1A CN202210134505A CN114463528B CN 114463528 B CN114463528 B CN 114463528B CN 202210134505 A CN202210134505 A CN 202210134505A CN 114463528 B CN114463528 B CN 114463528B
Authority
CN
China
Prior art keywords
target
picture
playing
model
perspectives
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210134505.1A
Other languages
Chinese (zh)
Other versions
CN114463528A (en
Inventor
息婧怡
李曼曼
韩宝健
王旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Xumi Yuntu Space Technology Co Ltd
Original Assignee
Shenzhen Xumi Yuntu Space Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Xumi Yuntu Space Technology Co Ltd filed Critical Shenzhen Xumi Yuntu Space Technology Co Ltd
Priority to CN202210134505.1A priority Critical patent/CN114463528B/en
Publication of CN114463528A publication Critical patent/CN114463528A/en
Application granted granted Critical
Publication of CN114463528B publication Critical patent/CN114463528B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to the technical field of picture processing, and provides a target visual angle playing method and device. The method comprises the following steps: acquiring a target model and determining an application scene corresponding to the target model; generating a plurality of target view angles for the target model according to the application scene, wherein the plurality of target view angles comprise: a plurality of model perspectives and a plurality of path perspectives; generating a button tree corresponding to a plurality of target visual angles, and associating the plurality of target visual angles with the button tree, wherein the button tree comprises a plurality of buttons, and one button corresponds to one target visual angle; and playing a plurality of target visual angles according to a preset playing rule. By adopting the technical means, the problems of complex presentation process and low efficiency in the prior art that a view of a three-dimensional model is presented in a multi-dimensional and omnibearing manner are solved.

Description

Target visual angle playing method and device
Technical Field
The disclosure relates to the technical field of picture processing, and in particular relates to a target visual angle playing method and device.
Background
At present, a view of a three-dimensional model is displayed in a multi-dimensional and omnibearing manner, the view is realized through various different types of software, the realization process is complex, and the efficiency is low. For example, if the prior art is used to perform multiparty communication and display of a product, at least four types of software (skichup+ AutoRevit +autopad+office) are needed to complete the process, the display content modification efficiency is extremely low, the process operation is complex, and the view space expressive force is poor.
Disclosure of Invention
In view of this, the embodiments of the present disclosure provide a target viewing angle playing method, apparatus, electronic device, and computer readable storage medium, so as to solve the problems in the prior art that a view of a three-dimensional model is displayed in multiple dimensions and in all directions, and the rendering process is complex and the efficiency is low.
In a first aspect of an embodiment of the present disclosure, a target viewing angle playing method is provided, including: acquiring a target model and determining an application scene corresponding to the target model; generating a plurality of target view angles for the target model according to the application scene, wherein the plurality of target view angles comprise: a plurality of model perspectives and a plurality of path perspectives; generating a button tree corresponding to a plurality of target visual angles, and associating the plurality of target visual angles with the button tree, wherein the button tree comprises a plurality of buttons, and one button corresponds to one target visual angle; and playing a plurality of target visual angles according to a preset playing rule.
In a second aspect of the embodiments of the present disclosure, there is provided a target viewing angle playing device, including: the acquisition module is configured to acquire a target model and determine an application scene corresponding to the target model; the generating module is configured to generate a plurality of target view angles for a target model according to an application scene, wherein the plurality of target view angles comprise: a plurality of model perspectives and a plurality of path perspectives; the association module is configured to generate a button tree corresponding to a plurality of target visual angles and associate the plurality of target visual angles with the button tree, wherein the button tree comprises a plurality of buttons, and one button corresponds to one target visual angle; and the playing module is configured to play a plurality of target visual angles according to a preset playing rule.
In a third aspect of the disclosed embodiments, an electronic device is provided, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the above method when executing the computer program.
In a fourth aspect of the disclosed embodiments, a computer-readable storage medium is provided, which stores a computer program which, when executed by a processor, implements the steps of the above-described method.
Compared with the prior art, the embodiment of the disclosure has the beneficial effects that: because the embodiment of the disclosure determines the application scene corresponding to the target model by acquiring the target model; generating a plurality of target view angles for the target model according to the application scene, wherein the plurality of target view angles comprise: a plurality of model perspectives and a plurality of path perspectives; generating a button tree corresponding to a plurality of target visual angles, and associating the plurality of target visual angles with the button tree, wherein the button tree comprises a plurality of buttons, and one button corresponds to one target visual angle; according to the preset playing rules, a plurality of target visual angles are played, so that the problems of complex and low efficiency of the presenting process of the view of the three-dimensional model in a multi-dimensional and omnibearing manner in the prior art can be solved by adopting the technical means, the process of presenting the view of the three-dimensional model is simplified, and the efficiency is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings that are required for the embodiments or the description of the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present disclosure, and other drawings may be obtained according to these drawings without inventive effort for a person of ordinary skill in the art.
Fig. 1 is a scene schematic diagram of an application scene of an embodiment of the present disclosure;
Fig. 2 is a schematic diagram of a target view playing method according to an embodiment of the present disclosure;
Fig. 3 is a flowchart illustrating a target view playing method according to an embodiment of the present disclosure
Fig. 4 is a schematic structural diagram of a target view playing device according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the disclosed embodiments. However, it will be apparent to one skilled in the art that the present disclosure may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present disclosure with unnecessary detail.
In the process of implementing the disclosed concept, the inventor finds that at least the following technical problems exist in the related art: and a view of a three-dimensional model is displayed in a multi-dimension and omnibearing manner, and the problems of complex display process and low efficiency are solved. According to the embodiment of the disclosure, the plurality of target view angles are generated by taking the application scene as the target model, and the plurality of target view angles are played according to the preset playing rule, so that the problems of complex and low presenting process and efficiency of presenting a view of a three-dimensional model in a multi-dimensional and omnibearing manner in the prior art can be solved, the process of presenting the view of the three-dimensional model is simplified, and the efficiency is improved.
A method and apparatus for playing a target view according to embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
Fig. 1 is a scene diagram of an application scene of an embodiment of the present disclosure. The application scenario may include terminal devices 1,2 and 3, a server 4 and a network 5.
The terminal devices 1,2 and 3 may be hardware or software. When the terminal devices 1,2 and 3 are hardware, they may be various electronic devices having a display screen and supporting communication with the server 4, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like; when the terminal apparatuses 1,2, and 3 are software, they can be installed in the electronic apparatus as above. The terminal devices 1,2 and 3 may be implemented as a plurality of software or software modules, or as a single software or software module, to which the embodiments of the present disclosure are not limited. Further, various applications, such as a data processing application, an instant messaging tool, social platform software, a search class application, a shopping class application, and the like, may be installed on the terminal devices 1,2, and 3.
The server 4 may be a server that provides various services, for example, a background server that receives a request transmitted from a terminal device with which communication connection is established, and the background server may perform processing such as receiving and analyzing the request transmitted from the terminal device and generate a processing result. The server 4 may be a server, a server cluster formed by a plurality of servers, or a cloud computing service center, which is not limited in the embodiment of the present disclosure.
The server 4 may be hardware or software. When the server 4 is hardware, it may be various electronic devices that provide various services to the terminal devices 1,2, and 3. When the server 4 is software, it may be a plurality of software or software modules providing various services to the terminal devices 1,2, and 3, or may be a single software or software module providing various services to the terminal devices 1,2, and 3, which is not limited by the embodiments of the present disclosure.
The network 5 may be a wired network using coaxial cable, twisted pair wire, and optical fiber connection, or may be a wireless network that can implement interconnection of various Communication devices without wiring, for example, bluetooth (Bluetooth), near Field Communication (NFC), infrared (Infrared), etc., which are not limited by the embodiments of the present disclosure.
The user can establish a communication connection with the server 4 via the network 5 through the terminal devices 1, 2, and 3 to receive or transmit information or the like. It should be noted that the specific types, numbers and combinations of the terminal devices 1, 2 and 3, the server 4 and the network 5 may be adjusted according to the actual requirements of the application scenario, which is not limited by the embodiment of the present disclosure.
Fig. 2 is a schematic diagram of a target view playing method according to an embodiment of the present disclosure. The target view playing method of fig. 2 may be performed by the terminal device or the server of fig. 1. As shown in fig. 2, the target viewing angle playing method includes:
s201, acquiring a target model, and determining an application scene corresponding to the target model;
S202, generating a plurality of target view angles for a target model according to an application scene, wherein the plurality of target view angles comprise: a plurality of model perspectives and a plurality of path perspectives;
s203, generating a button tree corresponding to a plurality of target visual angles, and associating the plurality of target visual angles with the button tree, wherein the button tree comprises a plurality of buttons, and one button corresponds to one target visual angle;
s204, playing a plurality of target visual angles according to a preset playing rule.
The target view angles of the target models to be presented are different in different application scenes, the categories of the target view angles of the target models to be presented are model view angles in some application scenes, and the categories of the target view angles of the target models to be presented are path view angles in some application scenes. Application scenes such as a product design scene, a product display scene and the like. A target view may be understood as a picture or a group of pictures. One model view may be one picture and one path view may be a view containing multiple pictures. For example, in a product design scenario, where a slow display of the product is required, multiple model perspectives should be used; in a product display scenario, a product can be displayed faster, and multiple path views should be used. Of course, no matter what scene, the model view angle and the path view angle can be combined and utilized, and the effect is better by combining the display of the model view angle and the path view angle. The product can be an object created by people or can be a natural resource. The button tree is a series of empty buttons which are not in operation link, and a plurality of target visual angles are associated with the buttons in the button tree, so that the display process of the plurality of target visual angles can be controlled through the buttons. The play target view may be a picture in the play target view.
According to the technical scheme provided by the embodiment of the disclosure, the application scene corresponding to the target model is determined by acquiring the target model; generating a plurality of target view angles for the target model according to the application scene, wherein the plurality of target view angles comprise: a plurality of model perspectives and a plurality of path perspectives; generating a button tree corresponding to a plurality of target visual angles, and associating the plurality of target visual angles with the button tree, wherein the button tree comprises a plurality of buttons, and one button corresponds to one target visual angle; according to the preset playing rules, a plurality of target visual angles are played, so that the problems of complex and low efficiency of the presenting process of the view of the three-dimensional model in a multi-dimensional and omnibearing manner in the prior art can be solved by adopting the technical means, the process of presenting the view of the three-dimensional model is simplified, and the efficiency is improved.
In step 202, a plurality of target perspectives are generated for a target model according to an application scene, where the plurality of target perspectives includes: a plurality of model perspectives and a plurality of path perspectives, comprising: generating a path view angle for the target model according to the application scene comprises: acquiring pictures of a target model under a plurality of angles; and generating a storage path corresponding to the target model, and storing pictures of the target model under a plurality of angles into the storage path to generate a path view angle.
One model view may be one picture and one path view may be a view containing multiple pictures. Generating a plurality of target view angles for the target model according to the application scene, and judging the category of the target view angles according to the application scene, namely whether the target view angles generated for the target model are model view angles or path view angles or both. After judging the category of the target visual angle, acquiring pictures of the target model under a plurality of angles, and generating a model visual angle or a path visual angle according to the plurality of pictures.
If the path view angle is generated for the target model, the path view angle is obtained by generating a storage path corresponding to the target model, namely, a storage path corresponding to the path view angle, and then storing pictures of the target model under a plurality of angles into the storage path. If the path view is required to be edited, deleting or adding new pictures in a plurality of pictures in a storage path corresponding to the path view.
If the model view angle is generated for the target model, each picture of the acquired target model can be directly used as one model view angle.
In step 202, a plurality of target perspectives are generated for a target model according to an application scene, where the plurality of target perspectives includes: a plurality of model perspectives and a plurality of path perspectives, comprising: acquiring element information corresponding to the target model, wherein the element information comprises: a plurality of elements on the object model, and positional information of each element; generating a plurality of snapshots corresponding to the target model according to the element information; and generating a plurality of target visual angles for the target model according to the application scene and the plurality of snapshots.
In the above embodiment, the image of the object model under multiple angles is acquired by the image acquisition device, and the embodiment of the disclosure provides a method for directly generating multiple object views for the object model without the need of the image acquisition device to acquire the image of the object model. Acquiring element information corresponding to the target model, wherein one element in the element information can be a key point of the target model, such as an inflection point, a vertex and the like; the position information of each element may be a spatial coordinate system established for the space where the target model is located, so as to obtain the spatial coordinate of each element on the target model, where the spatial coordinate is the position information. The target model can be restored according to the element information, and in the embodiment of the disclosure, a plurality of snapshots corresponding to the target model are generated according to the element information, and the snapshots can be understood as the imaging of the element information. And judging the category of the target visual angle according to the application scene, wherein the plurality of snapshots are pictures contained in the target visual angle.
In step 204, playing the target viewing angle according to a preset playing rule, including: acquiring the switching time length of each picture in a target visual angle and the playing time length of each picture; and playing the target visual angle according to the first accumulation time corresponding to each picture, the switching time of each picture, the second accumulation time corresponding to each picture and the playing time of each picture.
The switching duration of each picture is the duration required to switch from that picture to the next frame of picture. The playing time of each picture is the time required for playing one picture. A picture can be understood as a frame. The first accumulation time corresponding to each picture is the time of recording the switching process in the process of switching the picture to the picture of the next frame of the picture, for example, when one picture starts to be switched, the first accumulation time is zero, the switching process of one picture is just completed, and the first accumulation time is the switching time of the picture. The second accumulation time corresponding to each picture is the time of recording the playing process in the process of playing the picture, for example, when one picture starts to be played, the second accumulation time is zero, the playing of one picture is just completed, and the second accumulation time is the playing duration of the picture.
In step 204, according to the first accumulation time corresponding to each picture, the switching duration of each picture, the second accumulation time corresponding to each picture, the playing duration of each picture, the playing target viewing angle includes: according to the first accumulation time corresponding to the current frame picture and the switching time of the current frame picture, switching from the current frame picture to the next frame picture of the current frame picture is completed, and the method comprises the following steps: updating the first accumulation time corresponding to the current frame picture in real time, and starting to play the next frame picture of the current frame picture when the first accumulation time corresponding to the current frame picture is equal to the switching duration corresponding to the current frame picture.
When a picture starts to be switched, the first accumulation time is zero, the picture switching process is just completed, and the first accumulation time is the switching duration of the picture. If the first accumulation time is the switching time of the picture, the switching from the picture to the next frame of picture of the picture is completed, and after the switching is completed, the next frame of picture of the current frame of picture can be started to be played. If the first accumulation time is not equal to the switching duration, the switching is not completed, so that the picture is continuously switched until the first accumulation time is equal to the switching duration. For example, the current frame picture is picture a, the next frame picture of the current frame picture is picture B, and when the switching from picture a to picture B is completed, the playing of picture B is started.
In step 204, according to the first accumulation time corresponding to each picture, the switching duration of each picture, the second accumulation time corresponding to each picture, the playing duration of each picture, the playing target viewing angle includes: according to the second accumulation time corresponding to the current frame picture and the playing time length of the current frame picture, the playing of the current frame picture is completed, and the method comprises the following steps: and updating the second accumulation time corresponding to the current frame picture in real time, and starting switching from the current frame picture to the next frame picture of the current frame picture when the second accumulation time corresponding to the current frame picture is equal to the playing time corresponding to the current frame picture.
When one picture starts to be played, the second accumulation time is zero, the playing of the one picture is just completed, and the second accumulation time is the playing duration of the picture. And when the second accumulation time is equal to the playing duration, indicating that the playing of the picture is completed, and starting to switch the picture to the next frame of picture of the picture. And when the second accumulation time is not equal to the playing duration, indicating that the picture is not played, and continuing playing the picture. For example, the current frame picture is picture B, the next frame picture of the current frame picture is picture C, and when playing of picture B is completed, switching of picture B to picture C is started.
The present embodiment and the previous embodiment can be understood as one cycle in practice, for example, there are a picture a, a picture B, and a picture C. The next frame picture of picture a is picture B and the next frame picture of picture B is picture C. After the picture A is played, switching from the picture A to the picture B is started, and when the switching from the picture A to the picture B is completed, playing the picture B is started; when playing the picture B is completed, starting to switch the picture B to the picture C; and switching the picture B to the picture C to finish, starting to play the picture C … … if the picture C does not have the picture of the next frame, indicating that the target visual angle playing is finished, and continuing to switch the picture C to the picture of the next frame if the picture C exists.
After step 202 is performed, that is, a plurality of target perspectives are generated for the target model according to the application scenario, where the plurality of target perspectives includes: after the plurality of model perspectives and the plurality of path perspectives, the method further comprises: according to a preset series rule, connecting a plurality of target visual angles in series to form an animation corresponding to the target model; and playing the animation according to a preset playing rule.
And connecting the plurality of target visual angles in series according to a preset series rule to form an animation corresponding to the target model, wherein the plurality of target visual angles can be connected in series according to the clockwise rotation sequence of the target model or connected in series according to the anticlockwise rotation sequence of the target model.
Each target view is a target view at a different angle of the target model. According to a preset playing rule, playing the animation, namely sequentially playing each target view according to the sequence of connecting a plurality of target views in series, and playing each target view according to the preset playing rule.
Fig. 3 is a flow chart of a target view playing method according to an embodiment of the present disclosure, where the flow chart is shown in fig. 3:
S301: updating a second accumulation time corresponding to the current frame picture in the target visual angle in real time;
s302: judging whether the second accumulation time corresponding to the current frame picture is equal to the playing time corresponding to the current frame picture;
s303: if the second accumulation time corresponding to the current frame picture is equal to the playing time corresponding to the current frame picture, judging whether the current frame picture is the last frame picture in the target visual angle;
S304: if the current frame picture is not the last frame picture in the target visual angle, starting switching from the current frame picture to the next frame picture of the current frame picture;
S305: if the current frame picture is the last frame picture in the target view angle, stopping playing of the target view angle;
s306: if the second accumulation time corresponding to the current frame picture is not equal to the playing time length corresponding to the current frame picture, continuing to play the current frame picture;
S307: when switching from a current frame picture to a next frame picture of the current frame picture is started, updating a first accumulation time corresponding to the current frame picture in real time;
S308: judging whether the first accumulation time corresponding to the current frame picture is equal to the switching time corresponding to the current frame picture or not;
S309: if the first accumulation time corresponding to the current frame picture is equal to the switching time corresponding to the current frame picture, starting to play the next frame picture of the current frame picture;
s310: if the first accumulation time corresponding to the current frame picture is not equal to the switching duration corresponding to the current frame picture, continuously updating the first accumulation time corresponding to the current frame picture.
Any combination of the above optional solutions may be adopted to form an optional embodiment of the present application, which is not described herein.
The following are device embodiments of the present disclosure that may be used to perform method embodiments of the present disclosure. For details not disclosed in the embodiments of the apparatus of the present disclosure, please refer to the embodiments of the method of the present disclosure.
Fig. 3 is a schematic diagram of a target viewing angle playing device according to an embodiment of the disclosure. As shown in fig. 3, the target viewing angle playing device includes:
The acquisition module 401 is configured to acquire a target model and determine an application scene corresponding to the target model;
the generating module 402 is configured to generate a plurality of target perspectives for the target model according to the application scenario, where the plurality of target perspectives includes: a plurality of model perspectives and a plurality of path perspectives;
An association module 403 configured to generate a button tree corresponding to a plurality of target perspectives, and associate the plurality of target perspectives with the button tree, wherein the button tree comprises a plurality of buttons, and one target perspective corresponds to one button;
The playing module 404 is configured to play the plurality of target viewing angles according to a preset playing rule.
The target view angles of the target models to be presented are different in different application scenes, the categories of the target view angles of the target models to be presented are model view angles in some application scenes, and the categories of the target view angles of the target models to be presented are path view angles in some application scenes. Application scenes such as a product design scene, a product display scene and the like. A target view may be understood as a picture or a group of pictures. One model view may be one picture and one path view may be a view containing multiple pictures. For example, in a product design scenario, where a slow display of the product is required, multiple model perspectives should be used; in a product display scenario, a product can be displayed faster, and multiple path views should be used. Of course, no matter what scene, the model view angle and the path view angle can be combined and utilized, and the effect is better by combining the display of the model view angle and the path view angle. The product can be an object created by people or can be a natural resource. The button tree is a series of empty buttons which are not in operation link, and a plurality of target visual angles are associated with the buttons in the button tree, so that the display process of the plurality of target visual angles can be controlled through the buttons.
According to the technical scheme provided by the embodiment of the disclosure, the application scene corresponding to the target model is determined by acquiring the target model; generating a plurality of target view angles for the target model according to the application scene, wherein the plurality of target view angles comprise: a plurality of model perspectives and a plurality of path perspectives; generating a button tree corresponding to a plurality of target visual angles, and associating the plurality of target visual angles with the button tree, wherein the button tree comprises a plurality of buttons, and one button corresponds to one target visual angle; according to the preset playing rules, a plurality of target visual angles are played, so that the problems of complex and low efficiency of the presenting process of the view of the three-dimensional model in a multi-dimensional and omnibearing manner in the prior art can be solved by adopting the technical means, the process of presenting the view of the three-dimensional model is simplified, and the efficiency is improved.
Optionally, the generating module 402 is further configured to a plurality of model perspectives and a plurality of path perspectives, including: generating a path view angle for the target model according to the application scene comprises: acquiring pictures of a target model under a plurality of angles; and generating a storage path corresponding to the target model, and storing pictures of the target model under a plurality of angles into the storage path to generate a path view angle.
One model view may be one picture and one path view may be a view containing multiple pictures. Generating a plurality of target view angles for the target model according to the application scene, and judging the category of the target view angles according to the application scene, namely whether the target view angles generated for the target model are model view angles or path view angles or both. After judging the category of the target visual angle, acquiring pictures of the target model under a plurality of angles, and generating a model visual angle or a path visual angle according to the plurality of pictures.
If the path view angle is generated for the target model, the path view angle is obtained by generating a storage path corresponding to the target model, namely, a storage path corresponding to the path view angle, and then storing pictures of the target model under a plurality of angles into the storage path. If the path view is required to be edited, deleting or adding new pictures in a plurality of pictures in a storage path corresponding to the path view.
If the model view angle is generated for the target model, each picture of the acquired target model can be directly used as one model view angle.
Optionally, the generating module 402 is further configured to a plurality of model perspectives and a plurality of path perspectives, including: acquiring element information corresponding to the target model, wherein the element information comprises: a plurality of elements on the object model, and positional information of each element; generating a plurality of snapshots corresponding to the target model according to the element information; and generating a plurality of target visual angles for the target model according to the application scene and the plurality of snapshots.
In the above embodiment, the image of the object model under multiple angles is acquired by the image acquisition device, and the embodiment of the disclosure provides a method for directly generating multiple object views for the object model without the need of the image acquisition device to acquire the image of the object model. Acquiring element information corresponding to the target model, wherein one element in the element information can be a key point of the target model, such as an inflection point, a vertex and the like; the position information of each element may be a spatial coordinate system established for the space where the target model is located, so as to obtain the spatial coordinate of each element on the target model, where the spatial coordinate is the position information. The target model can be restored according to the element information, and in the embodiment of the disclosure, a plurality of snapshots corresponding to the target model are generated according to the element information, and the snapshots can be understood as the imaging of the element information. And judging the category of the target visual angle according to the application scene, wherein the plurality of snapshots are pictures contained in the target visual angle.
Optionally, the playing module 404 is further configured to obtain a switching duration of each picture in the target view and a playing duration of each picture; and playing the target visual angle according to the first accumulation time corresponding to each picture, the switching time of each picture, the second accumulation time corresponding to each picture and the playing time of each picture.
The switching duration of each picture is the duration required to switch from that picture to the next frame of picture. The playing time of each picture is the time required for playing one picture. A picture can be understood as a frame. The first accumulation time corresponding to each picture is the time of recording the switching process in the process of switching the picture to the picture of the next frame of the picture, for example, when one picture starts to be switched, the first accumulation time is zero, the switching process of one picture is just completed, and the first accumulation time is the switching time of the picture. The second accumulation time corresponding to each picture is the time of recording the playing process in the process of playing the picture, for example, when one picture starts to be played, the second accumulation time is zero, the playing of one picture is just completed, and the second accumulation time is the playing duration of the picture.
Optionally, the playing module 404 is further configured to complete switching from the current frame picture to the next frame picture of the current frame picture according to the first accumulation time corresponding to the current frame picture and the switching duration of the current frame picture, including: updating the first accumulation time corresponding to the current frame picture in real time, and starting to play the next frame picture of the current frame picture when the first accumulation time corresponding to the current frame picture is equal to the switching duration corresponding to the current frame picture.
When a picture starts to be switched, the first accumulation time is zero, the picture switching process is just completed, and the first accumulation time is the switching duration of the picture. If the first accumulation time is the switching time of the picture, the switching from the picture to the next frame of picture of the picture is completed, and after the switching is completed, the next frame of picture of the current frame of picture can be started to be played. If the first accumulation time is not equal to the switching duration, the switching is not completed, so that the picture is continuously switched until the first accumulation time is equal to the switching duration. For example, the current frame picture is picture a, the next frame picture of the current frame picture is picture B, and when the switching from picture a to picture B is completed, the playing of picture B is started.
Optionally, the playing module 404 is further configured to complete playing of the current frame picture according to the second accumulation time corresponding to the current frame picture and the playing duration of the current frame picture, including: and updating the second accumulation time corresponding to the current frame picture in real time, and starting switching from the current frame picture to the next frame picture of the current frame picture when the second accumulation time corresponding to the current frame picture is equal to the playing time corresponding to the current frame picture.
When one picture starts to be played, the second accumulation time is zero, the playing of the one picture is just completed, and the second accumulation time is the playing duration of the picture. And when the second accumulation time is equal to the playing duration, indicating that the playing of the picture is completed, and starting to switch the picture to the next frame of picture of the picture. And when the second accumulation time is not equal to the playing duration, indicating that the picture is not played, and continuing playing the picture. For example, the current frame picture is picture B, the next frame picture of the current frame picture is picture C, and when playing of picture B is completed, switching of picture B to picture C is started.
The present embodiment and the previous embodiment can be understood as one cycle in practice, for example, there are a picture a, a picture B, and a picture C. The next frame picture of picture a is picture B and the next frame picture of picture B is picture C. After the picture A is played, switching from the picture A to the picture B is started, and when the switching from the picture A to the picture B is completed, playing the picture B is started; when playing the picture B is completed, starting to switch the picture B to the picture C; and switching the picture B to the picture C to finish, starting to play the picture C … … if the picture C does not have the picture of the next frame, indicating that the target visual angle playing is finished, and continuing to switch the picture C to the picture of the next frame if the picture C exists.
Optionally, after the generating module 402 is further configured to the plurality of model perspectives and the plurality of path perspectives, the method further comprises: according to a preset series rule, connecting a plurality of target visual angles in series to form an animation corresponding to the target model; and playing the animation according to a preset playing rule.
And connecting the plurality of target visual angles in series according to a preset series rule to form an animation corresponding to the target model, wherein the plurality of target visual angles can be connected in series according to the clockwise rotation sequence of the target model or connected in series according to the anticlockwise rotation sequence of the target model.
Each target view is a target view at a different angle of the target model. According to a preset playing rule, playing the animation, namely sequentially playing each target view according to the sequence of connecting a plurality of target views in series, and playing each target view according to the preset playing rule.
It should be understood that the sequence number of each step in the foregoing embodiment does not mean that the execution sequence of each process should be determined by the function and the internal logic of each process, and should not constitute any limitation on the implementation process of the embodiments of the disclosure.
Fig. 5 is a schematic diagram of an electronic device 5 provided by an embodiment of the present disclosure. As shown in fig. 5, the electronic apparatus 5 of this embodiment includes: a processor 501, a memory 502 and a computer program 503 stored in the memory 502 and executable on the processor 501. The steps of the various method embodiments described above are implemented by processor 501 when executing computer program 503. Or the processor 501 when executing the computer program 503 performs the functions of the modules/units in the above-described apparatus embodiments.
Illustratively, the computer program 503 may be partitioned into one or more modules/units, which are stored in the memory 502 and executed by the processor 501 to complete the present disclosure. One or more of the modules/units may be a series of computer program instruction segments capable of performing a specific function for describing the execution of the computer program 503 in the electronic device 5.
The electronic device 5 may be a desktop computer, a notebook computer, a palm computer, a cloud server, or the like. The electronic device 5 may include, but is not limited to, a processor 501 and a memory 502. It will be appreciated by those skilled in the art that fig. 5 is merely an example of the electronic device 5 and is not meant to be limiting as the electronic device 5 may include more or fewer components than shown, or may combine certain components, or different components, e.g., the electronic device may further include an input-output device, a network access device, a bus, etc.
The Processor 501 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (DIGITAL SIGNAL Processor, DSP), application SPECIFIC INTEGRATED Circuit (ASIC), field-Programmable gate array (Field-Programmable GATE ARRAY, FPGA) or other Programmable logic device, discrete gate or transistor logic device, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 502 may be an internal storage unit of the electronic device 5, for example, a hard disk or a memory of the electronic device 5. The memory 502 may also be an external storage device of the electronic device 5, such as a plug-in hard disk, a smart memory card (SMART MEDIA CARD, SMC), a Secure Digital (SD) card, a flash memory card (FLASH CARD) or the like, which are provided on the electronic device 5. Further, the memory 502 may also include both internal storage units and external storage devices of the electronic device 5. The memory 502 is used to store computer programs and other programs and data required by the electronic device. The memory 502 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and in part, not described or illustrated in any particular embodiment, reference is made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
In the embodiments provided in the present disclosure, it should be understood that the disclosed apparatus/electronic device and method may be implemented in other manners. For example, the apparatus/electronic device embodiments described above are merely illustrative, e.g., the division of modules or elements is merely a logical functional division, and there may be additional divisions of actual implementations, multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present disclosure may implement all or part of the flow of the method of the above-described embodiments, or may be implemented by a computer program to instruct related hardware, and the computer program may be stored in a computer readable storage medium, where the computer program, when executed by a processor, may implement the steps of the method embodiments described above. The computer program may comprise computer program code, which may be in source code form, object code form, executable file or in some intermediate form, etc. The computer readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content of the computer readable medium can be appropriately increased or decreased according to the requirements of the jurisdiction's jurisdiction and the patent practice, for example, in some jurisdictions, the computer readable medium does not include electrical carrier signals and telecommunication signals according to the jurisdiction and the patent practice.
The above embodiments are merely for illustrating the technical solution of the present disclosure, and are not limiting thereof; although the present disclosure has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the disclosure, and are intended to be included in the scope of the present disclosure.

Claims (10)

1. A target viewing angle playing method, comprising:
acquiring a target model and determining an application scene corresponding to the target model;
generating a plurality of target view angles for the target model according to the application scene, wherein the plurality of target view angles comprise: a plurality of model view angles and a plurality of path view angles, wherein the model view angle is a picture, and the path view angles comprise a plurality of pictures;
generating a button tree corresponding to the multiple target visual angles, and associating the multiple target visual angles with the button tree, wherein the button tree comprises multiple buttons, and one button corresponds to one target visual angle;
And playing the plurality of target visual angles according to a preset playing rule.
2. The method of claim 1, wherein the generating the plurality of target perspectives for the target model from the application scenario, wherein the plurality of target perspectives comprises: a plurality of model perspectives and a plurality of path perspectives, comprising:
generating the path view angle for the target model according to the application scene comprises the following steps:
acquiring pictures of the target model under a plurality of angles;
and generating a storage path corresponding to the target model, and storing pictures of the target model under a plurality of angles into the storage path to generate the path view angle.
3. The method of claim 1, wherein the generating the plurality of target perspectives for the target model from the application scenario, wherein the plurality of target perspectives comprises: a plurality of model perspectives and a plurality of path perspectives, comprising:
acquiring element information corresponding to the target model, wherein the element information comprises: a plurality of elements on the object model, and positional information of each of the elements;
generating a plurality of snapshots corresponding to the target model according to the element information;
and generating a plurality of target visual angles for the target model according to the application scene and the plurality of snapshots.
4. The method of claim 1, wherein playing the plurality of target perspectives according to a preset playing rule comprises:
acquiring the switching time length of each picture in the multiple target visual angles and the playing time length of each picture;
and playing the plurality of target visual angles according to the first accumulation time corresponding to each picture, the switching time of each picture, the second accumulation time corresponding to each picture and the playing time of each picture.
5. The method of claim 4, wherein playing the plurality of target perspectives according to the first accumulation time corresponding to each picture, the switching duration of each picture, the second accumulation time corresponding to each picture, and the playing duration of each picture comprises:
completing switching from a current frame picture to a next frame picture of the current frame picture according to the first accumulation time corresponding to the current frame picture and the switching duration of the current frame picture, which comprises:
updating the first accumulation time corresponding to the current frame picture in real time, and starting to play the next frame picture of the current frame picture when the first accumulation time corresponding to the current frame picture is equal to the switching duration corresponding to the current frame picture.
6. The method of claim 4, wherein playing the plurality of target perspectives according to the first accumulation time corresponding to each picture, the switching duration of each picture, the second accumulation time corresponding to each picture, and the playing duration of each picture comprises:
completing playing of a current frame picture according to the second accumulation time corresponding to the current frame picture and the playing duration of the current frame picture, which comprises:
updating the second accumulation time corresponding to the current frame picture in real time, and starting switching from the current frame picture to a next frame picture of the current frame picture when the second accumulation time corresponding to the current frame picture is equal to the playing duration corresponding to the current frame picture.
7. The method of claim 1, wherein after generating the plurality of target perspectives for the target model according to the application scene, the method further comprises:
concatenating the plurality of target perspectives into an animation corresponding to the target model according to a preset concatenation rule; and
playing the animation according to the preset playing rule.
8. A target perspective playing device, comprising:
an acquisition module configured to acquire a target model and determine an application scene corresponding to the target model;
a generating module configured to generate a plurality of target perspectives for the target model according to the application scene, wherein the plurality of target perspectives comprise a plurality of model perspectives and a plurality of path perspectives, each model perspective is a single picture, and each path perspective comprises a plurality of pictures;
an association module configured to generate a button tree corresponding to the plurality of target perspectives and associate the plurality of target perspectives with the button tree, wherein the button tree comprises a plurality of buttons and each button corresponds to one target perspective; and
a playing module configured to play the plurality of target perspectives according to a preset playing rule.
9. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 7.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 7.
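To make the claimed arrangement easier to follow for readers outside patent practice, the sketch below illustrates one possible data model for the model perspectives, path perspectives, and button tree of claims 1 to 3. The class names, the directory layout under a per-model storage path, and the PNG file naming are illustrative assumptions, not features stated in the claims.

```python
import os
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ModelPerspective:
    """A model perspective is a single picture of the target model (claim 1)."""
    picture_path: str


@dataclass
class PathPerspective:
    """A path perspective groups several pictures under one storage path (claim 2)."""
    storage_path: str
    picture_paths: List[str] = field(default_factory=list)


@dataclass
class ButtonTree:
    """Associates each button with exactly one target perspective (claim 1)."""
    buttons: Dict[str, object] = field(default_factory=dict)

    def add(self, button_name: str, perspective: object) -> None:
        self.buttons[button_name] = perspective

    def perspective_for(self, button_name: str) -> object:
        return self.buttons[button_name]


def build_path_perspective(model_id: str, angle_pictures: Dict[float, bytes],
                           root_dir: str = "./perspectives") -> PathPerspective:
    """Store pictures captured at several angles under a storage path generated
    for the model, in the spirit of claim 2. The directory and file naming
    scheme here is an assumption for illustration only."""
    storage_path = os.path.join(root_dir, model_id)
    os.makedirs(storage_path, exist_ok=True)
    perspective = PathPerspective(storage_path=storage_path)
    for angle, picture_bytes in sorted(angle_pictures.items()):
        file_path = os.path.join(storage_path, f"angle_{angle:.1f}.png")
        with open(file_path, "wb") as f:
            f.write(picture_bytes)
        perspective.picture_paths.append(file_path)
    return perspective
```

A caller could, for example, build a `ButtonTree`, register one button per generated perspective, and look the perspective up by button name when that button is clicked, mirroring the one-button-to-one-perspective association of claim 1.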
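Claims 4 to 6 describe playback in terms of two accumulated times per picture: a second accumulation time compared against the picture's playing duration, and a first accumulation time compared against its switching duration. A minimal sketch of such a timer-driven loop follows; the `PlaybackState` class, the per-frame `tick` call, durations in seconds, and the wrap-around to the first picture are all assumptions for illustration, not the claimed implementation.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Picture:
    """One picture of a target perspective, with the two durations of claim 4."""
    image_id: str
    playing_duration: float    # how long the picture stays on screen, in seconds
    switching_duration: float  # how long the transition to the next picture takes


@dataclass
class PlaybackState:
    pictures: List[Picture]
    index: int = 0
    first_accumulation: float = 0.0   # accumulates during the switching phase (claim 5)
    second_accumulation: float = 0.0  # accumulates during the playing phase (claim 6)
    switching: bool = False

    def tick(self, dt: float) -> Picture:
        """Advance playback by dt seconds and return the picture to display."""
        current = self.pictures[self.index]
        if not self.switching:
            # Playing phase: accumulate until the playing duration is reached (claim 6),
            # then begin switching away from the current picture.
            self.second_accumulation += dt
            if self.second_accumulation >= current.playing_duration:
                self.switching = True
                self.first_accumulation = 0.0
        else:
            # Switching phase: accumulate until the switching duration is reached (claim 5),
            # then start playing the next picture.
            self.first_accumulation += dt
            if self.first_accumulation >= current.switching_duration:
                self.index = (self.index + 1) % len(self.pictures)
                self.switching = False
                self.second_accumulation = 0.0
        return self.pictures[self.index]
```

In a render loop, `tick` would be called once per frame with the elapsed time since the previous frame; concatenating the picture lists of several target perspectives before constructing `PlaybackState` would correspond to the animation playback of claim 7.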
CN202210134505.1A 2022-02-14 2022-02-14 Target visual angle playing method and device Active CN114463528B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210134505.1A CN114463528B (en) 2022-02-14 2022-02-14 Target visual angle playing method and device

Publications (2)

Publication Number Publication Date
CN114463528A CN114463528A (en) 2022-05-10
CN114463528B true CN114463528B (en) 2024-07-16

Family

ID=81412725

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210134505.1A Active CN114463528B (en) 2022-02-14 2022-02-14 Target visual angle playing method and device

Country Status (1)

Country Link
CN (1) CN114463528B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201590213U (en) * 2009-09-24 2010-09-22 张文中 Three-dimensional model automatic display system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105142017B (en) * 2015-08-12 2019-01-22 北京金山安全软件有限公司 Picture switching method and picture switching device during picture video playing
CN108376424A (en) * 2018-02-09 2018-08-07 腾讯科技(深圳)有限公司 Method, apparatus, equipment and storage medium for carrying out view angle switch to three-dimensional virtual environment
CN109242978B (en) * 2018-08-21 2023-07-07 百度在线网络技术(北京)有限公司 Viewing angle adjusting method and device for three-dimensional model

Also Published As

Publication number Publication date
CN114463528A (en) 2022-05-10

Similar Documents

Publication Publication Date Title
US12001478B2 (en) Video-based interaction implementation method and apparatus, device and medium
CN113470092B (en) Terrain rendering method and device, electronic equipment and storage medium
EP3299935A1 (en) Method for providing demonstration information in simulation environment, and associated simulation system
CN113867593B (en) Interaction method, device, electronic equipment and storage medium
EP4191513A1 (en) Image processing method and apparatus, device and storage medium
US20230131399A1 (en) Data processing method and apparatus, and readable medium and electronic device
JP2023548485A (en) Information display method, device, electronic device and computer readable storage medium
CN111381967A (en) Virtual object processing method and device
CN114697703A (en) Video data generation method and device, electronic equipment and storage medium
US20240073488A1 (en) Live video processing method and apparatus, device and medium
CN114463528B (en) Target visual angle playing method and device
CN110457106B (en) Information display method, device, equipment and storage medium
CN110288523B (en) Image generation method and device
CN112148744A (en) Page display method and device, electronic equipment and computer readable medium
CN116228952A (en) Virtual object mounting method, device, equipment and medium
CN112492399A (en) Information display method and device and electronic equipment
CN113031846B (en) Method and device for displaying description information of task and electronic equipment
CN114741193A (en) Scene rendering method and device, computer readable medium and electronic equipment
CN116527993A (en) Video processing method, apparatus, electronic device, storage medium and program product
CN110166825B (en) Video data processing method and device and video playing method and device
CN107659830B (en) Interactive activity access method, system and terminal equipment
CN115268740B (en) Virtual character extraction method, device, electronic equipment and readable storage medium
US20240269553A1 (en) Method, apparatus, electronic device and storage medium for extending reality display
EP4274237A1 (en) Information display method and apparatus, and device and medium
CN112306222B (en) Augmented reality method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant