CN116363337B - Model tour method and device and electronic equipment - Google Patents
- Publication number
- CN116363337B CN116363337B CN202310356304.0A CN202310356304A CN116363337B CN 116363337 B CN116363337 B CN 116363337B CN 202310356304 A CN202310356304 A CN 202310356304A CN 116363337 B CN116363337 B CN 116363337B
- Authority
- CN
- China
- Prior art keywords
- distance
- virtual camera
- determining
- sphere
- target model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2016—Rotation, translation, scaling
Abstract
The embodiments of the disclosure disclose a model tour method and device and an electronic device. The method includes: in response to receiving an instruction to perform a model tour on a target model, determining an initial position of a virtual camera based on at least one observation point position corresponding to the target model; moving the virtual camera according to a received displacement instruction and determining a first distance between the virtual camera and a center point of the target model; determining, based on the first distance, whether the virtual camera is in a buffer zone; and, in response to the virtual camera being in the buffer zone, determining a second distance between the virtual camera and the rotation axis of the target model according to the first distance. By setting the buffer zone, the embodiment enables the virtual camera to transition smoothly when it moves from outside the target model to inside it, or from inside the target model to outside it.
Description
Technical Field
The disclosure relates to internet technologies, and in particular to a model tour method, a model tour device and an electronic device.
Background
Virtual roaming is a technology that has emerged in recent years on the basis of virtual reality (VR): a computer simulation generates a virtual three-dimensional world that provides the user with visual, auditory, tactile and other sensory simulation, allowing the user to observe objects in the three-dimensional space freely and in real time, as if present in the scene. Such systems combine leading-edge technologies such as three-dimensional animation (e.g. 3ds Max), VRP virtual reality platforms, large-screen display and human-computer interaction; during virtual roaming, the viewer can actively and autonomously roam through the three-dimensional model scene with a joystick and thereby learn its internal structure.
Disclosure of Invention
The present disclosure has been made in order to solve the above technical problems. The embodiment of the disclosure provides a model tour method, a model tour device and electronic equipment.
According to an aspect of the embodiments of the present disclosure, there is provided a model tour method including:
In response to receiving an instruction for performing model tour on a target model, determining an initial position of a virtual camera based on at least one observation point position corresponding to the target model;
Moving the virtual camera according to the received displacement instruction, and determining a first distance between the virtual camera and a center point of the target model;
determining whether the virtual camera is in a buffer based on the first distance;
And in response to the virtual camera being in the buffer, determining a second distance between the virtual camera and the rotation axis of the target model according to the first distance.
Optionally, before determining whether the virtual camera is in the buffer based on the first distance, further comprising:
determining the center point coordinates of the target model based on all vertex coordinates of the target model in a model coordinate system;
Determining a first sphere corresponding to the target model based on the center point coordinates and all the vertex information;
Determining the buffer zone based on the radius of the first sphere and a preset buffer ratio; wherein the value of the preset buffer ratio is between 0 and 1.
Optionally, the determining, based on the coordinates of the center point and the information of all vertices, a first sphere corresponding to the target model includes:
determining maximum length information of the target model on three coordinate axes of the model coordinate system based on all the vertex coordinates;
and determining the radius of the first sphere based on the center point coordinates and the maximum length information on the three coordinate axes, and determining the first sphere taking the center point coordinates as the sphere center.
Optionally, the determining the buffer area based on the radius of the first sphere and a preset buffer ratio includes:
determining the radius of a second sphere based on the preset buffer ratio and the radius of the first sphere;
determining the second sphere with the center point coordinates as a sphere center based on the radius of the second sphere;
Determining the area between the second sphere and the first sphere as the buffer area.
Optionally, the determining whether the virtual camera is in a buffer based on the first distance includes:
determining whether the first distance is greater than a radius of the second sphere and less than a radius of the first sphere;
responsive to the first distance being greater than a radius of the second sphere and less than a radius of the first sphere, determining that the virtual camera is in the buffer.
Optionally, the method further comprises:
And in response to the virtual camera not being in a buffer, determining that the second distance is the first distance or zero.
Optionally, the determining that the second distance is the first distance or zero includes:
Determining the second distance as the first distance in response to the first distance being greater than a radius of the first sphere;
In response to the first distance being less than a radius of the second sphere, the second distance is determined to be zero.
Optionally, the method further comprises:
in response to a received rotation instruction, determining, according to the second distance, whether the target model rotates around the virtual camera or around the rotation axis in accordance with the rotation instruction.
According to another aspect of the disclosed embodiments, there is provided a model tour device including:
The virtual camera determining module is used for determining the initial position of the virtual camera based on at least one observation point position corresponding to the target model in response to receiving an instruction for performing model tour on the target model;
the first distance module is used for moving the virtual camera according to the received displacement instruction and determining a first distance between the virtual camera and the center point of the target model;
the buffer zone judging module is used for determining whether the virtual camera is in a buffer zone or not based on the first distance;
and the second distance module is used for responding to the fact that the virtual camera is in the buffer zone, and determining a second distance between the virtual camera and the rotating shaft of the target model according to the first distance.
Optionally, the apparatus further comprises:
The center point module is used for determining the center point coordinates of the target model based on all the vertex coordinates of the target model in a model coordinate system;
The first sphere module is used for determining a first sphere corresponding to the target model based on the center point coordinates and all the vertex information;
The buffer zone determining module is used for determining the buffer zone based on the radius of the first sphere and a preset buffer proportion; wherein the value of the preset buffer ratio is between 0 and 1.
Optionally, the first sphere module is specifically configured to determine maximum length information of the target model on three coordinate axes of the model coordinate system based on all vertex coordinates; and determining the radius of the first sphere based on the center point coordinates and the maximum length information on the three coordinate axes, and determining the first sphere taking the center point coordinates as the sphere center.
Optionally, the buffer determining module is specifically configured to determine a radius of a second sphere based on the preset buffer ratio and the radius of the first sphere; determining the second sphere with the center point coordinates as a sphere center based on the radius of the second sphere; determining the area between the second sphere and the first sphere as the buffer area.
Optionally, the buffer zone determining module is specifically configured to determine whether the first distance is greater than a radius of the second sphere and less than a radius of the first sphere; responsive to the first distance being greater than a radius of the second sphere and less than a radius of the first sphere, determining that the virtual camera is in the buffer.
Optionally, the second distance module is further configured to determine, in response to the virtual camera not being in a buffer, that the second distance is the first distance or zero.
Optionally, the second distance module is configured to determine, when determining that the second distance is the first distance or zero, that the second distance is the first distance in response to the first distance being greater than a radius of the first sphere; in response to the first distance being less than a radius of the second sphere, the second distance is determined to be zero.
Optionally, the apparatus further comprises:
and the model rotating module is used for, in response to a received rotation instruction, determining, according to the second distance, whether the target model rotates around the virtual camera or around the rotation axis in accordance with the rotation instruction.
According to still another aspect of the embodiments of the present disclosure, there is provided an electronic device including:
A memory for storing a computer program product;
A processor configured to execute the computer program product stored in the memory, and when executed, implement the model tour method according to any of the above embodiments.
According to a further aspect of the disclosed embodiments, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the model tour method according to any of the above embodiments.
According to a further aspect of the disclosed embodiments, there is provided a computer program product comprising computer program instructions which, when executed by a processor, implement the model tour method according to any of the embodiments described above.
Based on the model tour method, device and electronic device provided by the embodiments of the disclosure: in response to receiving an instruction to perform a model tour on a target model, an initial position of a virtual camera is determined based on at least one observation point corresponding to the target model; the virtual camera is moved according to a received displacement instruction, and a first distance between the virtual camera and a center point of the target model is determined; whether the virtual camera is in a buffer zone is determined based on the first distance; and, in response to the virtual camera being in the buffer zone, a second distance between the virtual camera and the rotation axis of the target model is determined according to the first distance. By setting the buffer zone, this embodiment lets the virtual camera transition smoothly when moving from outside the target model to inside it or vice versa, switch freely between the first-person and third-person viewing angles, and avoid drastic view changes caused by an abrupt switch of viewing angle, so that the target model can be moved around and inspected freely.
The technical scheme of the present disclosure is described in further detail below through the accompanying drawings and examples.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description, serve to explain the principles of the disclosure.
The disclosure may be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings in which:
FIG. 1 is a flow diagram of a model tour method provided by an exemplary embodiment of the present disclosure;
FIG. 2 is a flow diagram of a model tour method provided by another exemplary embodiment of the present disclosure;
FIG. 3 is a schematic diagram of an exemplary buffer area in a model tour method according to an exemplary embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a model tour device according to an exemplary embodiment of the present disclosure;
Fig. 5 illustrates a block diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, example embodiments according to the present disclosure will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present disclosure and not all of the embodiments of the present disclosure, and that the present disclosure is not limited by the example embodiments described herein.
It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless it is specifically stated otherwise.
It will be appreciated by those of skill in the art that the terms "first," "second," etc. in embodiments of the present disclosure are used merely to distinguish between different steps, devices or modules, etc., and do not represent any particular technical meaning nor necessarily logical order between them.
It should also be understood that in embodiments of the present disclosure, "plurality" may refer to two or more, and "at least one" may refer to one, two or more.
It should also be appreciated that any component, data, or structure referred to in the presently disclosed embodiments may be generally understood as one or more without explicit limitation or the contrary in the context.
In addition, the term "and/or" in this disclosure is merely an association relationship describing an association object, and indicates that three relationships may exist, for example, a and/or B may indicate: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" in the present disclosure generally indicates that the front and rear association objects are an or relationship. The data referred to in this disclosure may include unstructured data, such as text, images, video, and the like, as well as structured data.
It should also be understood that the description of the various embodiments of the present disclosure emphasizes the differences between the various embodiments, and that the same or similar features may be referred to each other, and for brevity, will not be described in detail.
Meanwhile, it should be understood that the sizes of the respective parts shown in the drawings are not drawn in actual scale for convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
Embodiments of the present disclosure may be applicable to electronic devices such as terminal devices, computer systems, servers, etc., which may operate with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known terminal devices, computing systems, environments, and/or configurations that may be suitable for use with the terminal device, computer system, server, or other electronic device include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, network personal computers, minicomputer systems, mainframe computer systems, and distributed cloud computing technology environments that include any of the above systems, and the like.
Electronic devices such as terminal devices, computer systems, servers, etc. may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc., that perform particular tasks or implement particular abstract data types. The computer system/server may be implemented in a distributed cloud computing environment in which tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computing system storage media including memory storage devices.
Summary of the application
In the process of implementing the present disclosure, the inventors found that there are currently two technical schemes for VR roaming. The first supports only first-person free movement (forward, backward, left and right) centered on the current position inside the VR scene; it cannot move outside the scene to view the state of the whole VR model from a third-person bird's-eye view. The second supports only a third-person viewing angle: the user can freely inspect the model from outside and can also enter the model to roam inside it, but because the viewing angle always remains third-person, the line of sight stays far away; once the model is rotated while roaming inside it, the line of sight changes very sharply, which is perceived as a sudden sharp turn or a large swing of the head.
Exemplary method
Fig. 1 is a flow diagram of a model tour method according to an exemplary embodiment of the present disclosure. The embodiment can be applied to an electronic device, as shown in fig. 1, and includes the following steps:
step 102, in response to receiving an instruction for performing model tour on the target model, determining an initial position of the virtual camera based on at least one observation point position corresponding to the target model.
The target model can be any three-dimensional model, for example a three-dimensional model of a space such as a house or a vehicle. When the target model is built, at least one observation point is defined for it, for example at least one third-person observation point outside the model and at least one first-person observation point inside the model. Through a third-person observation point, the whole model can be observed from outside: no matter how the user operates, the virtual camera rotates around the rotation axis of the target model, and moving the target model only adjusts the relative position of the target model and the virtual camera without changing the position of the rotation axis. Through a first-person observation point, the model can be observed freely from inside: the rotation axis of the target model coincides with the virtual camera, so rotating around the rotation axis amounts to rotating around the virtual camera, which is equivalent to simulating human eyes and looking around freely in the first person.
Step 104, moving the virtual camera according to the received displacement instruction, and determining a first distance between the virtual camera and the center point of the target model.
In one embodiment, during the model tour, displacement and rotation are not performed simultaneously; no other operation is performed while the virtual camera is being displaced.
Step 106, determining whether the virtual camera is in the buffer based on the first distance.
In this embodiment, a buffer zone is provided between the area outside the target model, which corresponds to the third-person viewing angle, and the area inside the target model, which corresponds to the first-person viewing angle. This solves the problem of a drastic change in the line of sight caused by switching viewing angles during a model tour.
Step 108, responsive to the virtual camera being in the buffer, determining a second distance between the virtual camera and the axis of rotation of the object model from the first distance.
Optionally, when the virtual camera is in the buffer zone, it corresponds neither to the third-person viewing angle outside the target model nor to the first-person viewing angle inside the model. A second distance between the virtual camera and the rotation axis of the target model is then determined based on the first distance; with this second distance, receiving a rotation instruction at that position no longer produces a sharp turn or a large swing of the view, and a smooth viewing-angle transition is achieved.
In the model tour method provided by the embodiment of the disclosure: in response to receiving an instruction to perform a model tour on a target model, an initial position of a virtual camera is determined based on at least one observation point corresponding to the target model; the virtual camera is moved according to a received displacement instruction, and a first distance between the virtual camera and a center point of the target model is determined; whether the virtual camera is in a buffer zone is determined based on the first distance; and, in response to the virtual camera being in the buffer zone, a second distance between the virtual camera and the rotation axis of the target model is determined according to the first distance. By setting the buffer zone, this embodiment lets the virtual camera transition smoothly when moving from outside the target model to inside it or vice versa, switch freely between the first-person and third-person viewing angles, and avoid drastic view changes caused by an abrupt switch of viewing angle, so that the target model can be moved around and inspected freely.
As shown in fig. 2, on the basis of the embodiment shown in fig. 1, before step 106, the method may further include:
In step 201, the coordinates of the center point of the target model are determined based on all the vertex coordinates of the target model in the model coordinate system.
When the target model is generated, the coordinates of each vertex in the model are recorded, so all vertex coordinates of the target model can be obtained directly from it. From the maximum and minimum coordinate values of all vertices on the x, y and z axes of the model coordinate system, the model lengths lx, ly and lz in the three directions are computed, and from these the center point coordinates of the target model are obtained. In this embodiment, the model coordinate system is a coordinate system established for the target model, for example with an arbitrary vertex of the target model as the origin and three mutually perpendicular directions determined by three of the edges connected to that origin.
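The computation just described can be sketched as follows. This is an illustrative example, not code from the patent; the function name `center_and_lengths` and the sample vertices are assumptions.

```python
def center_and_lengths(vertices):
    """Compute the bounding-box center and per-axis lengths lx, ly, lz.

    vertices: list of (x, y, z) tuples in the model coordinate system.
    """
    xs, ys, zs = zip(*vertices)
    # Model lengths in the three axis directions, from max/min vertex values.
    lx, ly, lz = max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs)
    # Center point: midpoint of the extremes on each axis.
    center = (
        (max(xs) + min(xs)) / 2,
        (max(ys) + min(ys)) / 2,
        (max(zs) + min(zs)) / 2,
    )
    return center, (lx, ly, lz)

center, (lx, ly, lz) = center_and_lengths(
    [(0, 0, 0), (2, 0, 0), (2, 4, 0), (0, 4, 6)]
)
print(center, lx, ly, lz)  # (1.0, 2.0, 3.0) 2 4 6
```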
Step 202, determining a first sphere corresponding to the target model based on the coordinates of the central point and all vertex information.
After the center point coordinates are determined, the first sphere can be determined with the center point as its sphere center; optionally, the first sphere is the model's circumscribed (outer) sphere.
In step 203, a buffer is determined based on the radius of the first sphere and a preset buffer ratio.
Wherein, the value of the preset buffer ratio is between 0 and 1.
In this embodiment, because three-dimensional models differ in size, the buffer zone is defined by a preset buffer ratio of the radius of the model's outer sphere so as to adapt to models of different sizes; the preset buffer ratio can be set according to the actual application scenario, for example 0.4, 0.5 or 0.6. When the virtual camera gradually approaches the target model from outside, always keeping a constant distance between the virtual camera and the rotation axis would produce a large rotation angle, while changing that distance to 0 immediately would produce a strong jumping sensation and an incoherent experience. To soften this discomfort, this embodiment provides a buffer zone: while the virtual camera is in the buffer zone, the rotation axis approaches the virtual camera proportionally until the distance between them is 0.
Optionally, step 202 may include:
determining maximum length information of the target model on three coordinate axes of a model coordinate system based on all vertex coordinates;
and determining the radius of the first sphere based on the center point coordinate and the maximum length information on the three coordinate axes, and determining the first sphere taking the center point coordinate as the sphere center.
In this embodiment, once all vertex coordinates are known, the length of the model along each coordinate axis can be determined from the difference between the maximum and minimum vertex values on that axis, giving the maximum length information on the three coordinate axes, expressed for example as lx, ly, lz. The radius of the outer sphere can then be determined as the square root of the sum of the squares of lx/2, ly/2 and lz/2; with the sphere center and the radius known, the first sphere is uniquely determined.
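The radius computation above (half the diagonal of the axis-aligned bounding box) can be sketched as follows; the function name is illustrative, not from the patent.

```python
import math

def bounding_sphere_radius(lx, ly, lz):
    """Radius of the circumscribed (first) sphere: the square root of the
    sum of squares of lx/2, ly/2 and lz/2, per the description above."""
    return math.sqrt((lx / 2) ** 2 + (ly / 2) ** 2 + (lz / 2) ** 2)

print(bounding_sphere_radius(6, 8, 0))  # 5.0
```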
Optionally, step 203 may include:
determining the radius of the second sphere based on a preset buffer ratio and the radius of the first sphere;
alternatively, the radius of the second sphere may be determined based on the following formula (1):
r2 = r1 - k × r1   formula (1)
Wherein r1 represents the radius of the first sphere, r2 the radius of the second sphere, and k the preset buffer ratio, i.e. the ratio of the distance from the second sphere's surface to the first sphere's surface to the radius of the first sphere.
Determining a second sphere with the center point coordinates as the center of the sphere based on the radius of the second sphere;
the area between the second sphere and the first sphere is determined to be a buffer.
Alternatively, the second sphere corresponds to the same sphere center as the first sphere, except that the second sphere radius is smaller than the first sphere radius, and thus there is a section of intersection area outside the second sphere and inside the first sphere, which is determined as a buffer area in this embodiment, for example, as shown in fig. 3, the area between the first sphere and the second sphere is a buffer area, and in the buffer area, the following binary once equation can be fitted as equation (2) for calculating the second distance:
D2 = a × (r1 − D1) + b   formula (2)
Wherein D1 represents the first distance between the virtual camera and the center point of the target model; D2 represents the second distance between the rotation axis and the virtual camera; r1 represents the radius of the first sphere; and a and b are constant parameters.
Optionally, at the surface of the first sphere the distance between the rotation axis and the virtual camera equals the radius of the first sphere, and inside the second sphere that distance is 0. Based on these two conditions the linear equation can be solved and the values of a and b determined; for example, a = −r1/(r1 − (1 − k) × r1) = −1/k, and b = r1. To avoid an abrupt change of the distance, the second distance between the virtual camera and the rotation axis of the target model may be determined by the above formula, so as to realize a smooth transition of the distance between the virtual camera and the rotation axis.
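A minimal sketch of formula (2) with both boundary conditions applied (function name and parameter order are illustrative assumptions, not from the patent):

```python
def second_distance_in_buffer(d1, r1, k):
    """Second distance D2 inside the buffer, per formula (2):
    D2 = a * (r1 - D1) + b, with a = -1/k and b = r1, so that
    D2 = r1 at the first sphere's surface (D1 = r1) and D2 = 0 at
    the second sphere's surface (D1 = (1 - k) * r1)."""
    a = -1.0 / k          # equivalently -r1 / (r1 - (1 - k) * r1)
    b = r1
    return a * (r1 - d1) + b
```

With r1 = 10 and k = 0.2 (second sphere radius 8), the result interpolates linearly from 10 at the outer surface down to 0 at the inner surface.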
Optionally, step 106 may include:
Determining whether the first distance is greater than a radius of the second sphere and less than a radius of the first sphere;
In response to the first distance being greater than the radius of the second sphere and less than the radius of the first sphere, determining that the virtual camera is in the buffer.
Based on the above embodiments, the buffer lies between the first sphere and the second sphere, so the virtual camera can be determined to be inside the buffer only when the first distance is recognized to be greater than the radius of the second sphere and less than the radius of the first sphere; otherwise the virtual camera is outside the target model or inside it.
In some optional embodiments, the method provided in this embodiment may further include:
In response to the virtual camera not being in the buffer, the second distance is determined to be the first distance or zero.
Optionally, determining the second distance as the first distance or zero includes:
determining the second distance as the first distance in response to the first distance being greater than the radius of the first sphere; in response to the first distance being less than the radius of the second sphere, the second distance is determined to be zero.
When the virtual camera is not in the buffer zone, it may be outside the target model or inside the target model. In this embodiment, when the virtual camera is outside the target model, the second distance is determined by the first distance, realizing a third-person tour viewing angle: when a rotation instruction is received, the virtual camera rotates around the rotation axis. When the virtual camera is inside the target model, the second distance is determined to be zero (that is, the virtual camera coincides with the rotation axis); when a rotation instruction is received, the virtual camera rotates around the rotation axis, which is equivalent to rotating around itself, realizing a first-person tour viewing angle.
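The full piecewise rule described above can be sketched as follows (an illustration under the same assumed names as before, not the patent's own implementation):

```python
def second_distance(d1, r1, k):
    """Distance between the virtual camera and the rotation axis:
    equal to D1 outside the first sphere (third-person view), zero
    inside the second sphere (first-person view), and linearly
    interpolated inside the buffer between them."""
    r2 = r1 - k * r1                      # formula (1)
    if d1 >= r1:                          # outside the target model
        return d1
    if d1 <= r2:                          # inside the second sphere
        return 0.0
    return (-1.0 / k) * (r1 - d1) + r1    # buffer: formula (2)
```

The three branches join continuously at the two sphere surfaces, which is what makes the transition between viewing angles smooth.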
In addition, to ensure the smoothness of the user operation, this embodiment recalculates the second distance between the rotation axis and the virtual camera only after the user stops the current operation (for example, after completing one displacement).
In some optional embodiments, the method provided in this embodiment may further include:
In response to a received rotation instruction, the target model is determined, according to the second distance, to rotate around the virtual camera or around the rotation axis as the rotation instruction directs.
This embodiment realizes determining the rotation mode of the target model based on the distance between the virtual camera and the rotation axis. Specifically, when the second distance is 0, the virtual camera rotating around the rotation axis is equivalent to rotating around itself; the browsing effect then simulates human eyes, freely observing the surroundings in first person. When the second distance is the first distance, the virtual camera rotates around the rotation axis. When the virtual camera is in the buffer, the second distance is determined based on the first distance, and the rotation axis approaches the virtual camera proportionally with distance; a received rotation instruction still makes the virtual camera rotate around the rotation axis, but because the second distance is smaller than the first distance, phenomena such as sharp turns or large swings of the view do not occur.
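To illustrate how the second distance selects the rotation pivot, the sketch below rotates the camera about a vertical axis placed at distance d2 along the direction toward the model center; the function, its signature, and the vertical-axis simplification are assumptions for this example only:

```python
import math

def rotate_camera(cam_pos, center, d2, angle):
    """Rotate the virtual camera by `angle` radians about a vertical
    rotation axis at distance `d2` from the camera, along the ground-plane
    direction from the camera toward the model center. When d2 == 0 the
    pivot coincides with the camera (first-person self-rotation), so the
    position is unchanged and only the view direction would turn."""
    cx, cy, cz = cam_pos
    mx, my, mz = center
    # Direction from camera to model center, projected on the ground plane.
    dx, dz = mx - cx, mz - cz
    norm = math.hypot(dx, dz) or 1.0
    # Pivot point: the rotation axis passes through here.
    px, pz = cx + d2 * dx / norm, cz + d2 * dz / norm
    # Rotate the camera position about the vertical axis through the pivot.
    rx, rz = cx - px, cz - pz
    cosa, sina = math.cos(angle), math.sin(angle)
    return (px + rx * cosa - rz * sina, cy, pz + rx * sina + rz * cosa)
```

A half-turn with d2 = 5 carries a camera at the origin to the far side of the pivot, while d2 = 0 leaves the position fixed, matching the first-person case.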
Any of the model tour methods provided by the embodiments of the present disclosure may be performed by any suitable device having data processing capabilities, including, but not limited to, terminal equipment and servers. Alternatively, any of the model tour methods provided by the embodiments of the present disclosure may be executed by a processor, for example by the processor invoking corresponding instructions stored in a memory. Details are not repeated below.
Exemplary apparatus
Fig. 4 is a schematic structural view of a model tour device according to an exemplary embodiment of the present disclosure. As shown in fig. 4, the apparatus provided in this embodiment includes:
the virtual camera determining module 41 is configured to determine an initial position of the virtual camera based on at least one observation point position corresponding to the target model in response to receiving an instruction to perform model tour on the target model.
The first distance module 42 is configured to move the virtual camera according to the received displacement instruction, and determine a first distance between the virtual camera and a center point of the target model.
The buffer determination module 43 is configured to determine whether the virtual camera is in the buffer based on the first distance.
A second distance module 44 for determining a second distance between the virtual camera and the axis of rotation of the object model based on the first distance in response to the virtual camera being in the buffer.
In response to receiving an instruction for performing a model tour on a target model, the model tour device provided by the embodiment of the disclosure determines an initial position of a virtual camera based on at least one observation point position corresponding to the target model; moves the virtual camera according to the received displacement instruction and determines a first distance between the virtual camera and a center point of the target model; determines whether the virtual camera is in a buffer based on the first distance; and, in response to the virtual camera being in the buffer, determines a second distance between the virtual camera and the rotation axis of the target model according to the first distance. By providing the buffer, this embodiment enables the virtual camera to transition smoothly when entering the target model from outside, or when moving from inside the target model to outside, and to switch freely between the first-person and third-person viewing angles, avoiding drastic viewing-angle changes caused by sudden shifts, so that the target model can be freely moved around and viewed.
In some optional embodiments, the apparatus provided in this embodiment further includes:
The center point module is used for determining the center point coordinates of the target model based on all the vertex coordinates of the target model in a model coordinate system;
The first sphere module is used for determining a first sphere corresponding to the target model based on the center point coordinates and all the vertex information;
The buffer zone determining module is used for determining the buffer zone based on the radius of the first sphere and a preset buffer proportion; wherein the value of the preset buffer ratio is between 0 and 1.
Optionally, the first sphere module is specifically configured to determine maximum length information of the target model on three coordinate axes of the model coordinate system based on all vertex coordinates; and determining the radius of the first sphere based on the center point coordinates and the maximum length information on the three coordinate axes, and determining the first sphere taking the center point coordinates as the sphere center.
Optionally, the buffer determining module is specifically configured to determine a radius of a second sphere based on the preset buffer ratio and the radius of the first sphere; determining the second sphere with the center point coordinates as a sphere center based on the radius of the second sphere; determining the area between the second sphere and the first sphere as the buffer area.
Optionally, the buffer determination module is specifically configured to determine whether the first distance is greater than the radius of the second sphere and less than the radius of the first sphere; and, responsive to the first distance being greater than the radius of the second sphere and less than the radius of the first sphere, determine that the virtual camera is in the buffer.
Optionally, the second distance module is further configured to determine, in response to the virtual camera not being in a buffer, that the second distance is the first distance or zero.
Optionally, the second distance module is configured to determine, when determining that the second distance is the first distance or zero, that the second distance is the first distance in response to the first distance being greater than a radius of the first sphere; in response to the first distance being less than a radius of the second sphere, the second distance is determined to be zero.
In some optional embodiments, the apparatus provided in this embodiment further includes:
and the model rotating module is used for responding to the received rotating instruction, determining that the target model rotates around the virtual camera according to the rotating instruction or rotates around the rotating shaft according to the rotating instruction according to the second distance.
Exemplary electronic device
Next, an electronic device according to an embodiment of the present disclosure is described with reference to fig. 5. The electronic device may be either or both of the first device and the second device, or a stand-alone device independent thereof, which may communicate with the first device and the second device to receive the acquired input signals therefrom.
Fig. 5 illustrates a block diagram of an electronic device according to an embodiment of the present disclosure.
As shown in fig. 5, the electronic device includes one or more processors and memory.
The processor may be a Central Processing Unit (CPU) or other form of processing unit having data processing and/or instruction execution capabilities, and may control other components in the electronic device to perform the desired functions.
The memory may store one or more computer program products, and may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM) and/or cache memory. The non-volatile memory may include, for example, Read-Only Memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium, and the processor may run the program instructions to implement the model tour methods of the various embodiments of the present disclosure described above and/or other desired functions.
In one example, the electronic device may further include: input devices and output devices, which are interconnected by a bus system and/or other forms of connection mechanisms (not shown).
In addition, the input device may include, for example, a keyboard, a mouse, and the like.
The output device may output various information including the determined distance information, direction information, etc., to the outside. The output device may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, etc.
Of course, only some of the components of the electronic device relevant to the present disclosure are shown in fig. 5 for simplicity, components such as buses, input/output interfaces, etc. being omitted. In addition, the electronic device may include any other suitable components depending on the particular application.
In addition to the methods and apparatus described above, embodiments of the present disclosure may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform the steps in a model tour method according to various embodiments of the present disclosure described in the above section of the present description.
The computer program product may include program code for performing the operations of embodiments of the present disclosure written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, and conventional procedural programming languages such as the "C" language. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present disclosure may also be a computer-readable storage medium, having stored thereon computer program instructions, which when executed by a processor, cause the processor to perform the steps in a model tour method according to various embodiments of the present disclosure described in the above section of the present description.
The computer readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present disclosure have been described above in connection with specific embodiments, but it should be noted that the advantages, benefits, effects, etc. mentioned in the present disclosure are merely examples and not limiting, and these advantages, benefits, effects, etc. are not to be considered as necessarily possessed by the various embodiments of the present disclosure. Furthermore, the specific details disclosed herein are for purposes of illustration and understanding only, and are not intended to be limiting, since the disclosure is not necessarily limited to practice with the specific details described.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different manner from other embodiments, so that the same or similar parts between the embodiments are mutually referred to. For system embodiments, the description is relatively simple as it essentially corresponds to method embodiments, and reference should be made to the description of method embodiments for relevant points.
The block diagrams of the devices, apparatuses, and systems referred to in this disclosure are merely illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. As will be appreciated by one of skill in the art, these devices, apparatuses, and systems may be connected, arranged, and configured in any manner. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including but not limited to" and are used interchangeably therewith. The term "or" as used herein refers to, and is used interchangeably with, the term "and/or" unless the context clearly indicates otherwise. The term "such as" as used herein refers to, and is used interchangeably with, the phrase "such as, but not limited to."
The methods and apparatus of the present disclosure may be implemented in a number of ways. For example, the methods and apparatus of the present disclosure may be implemented by software, hardware, firmware, or any combination of software, hardware, firmware. The above-described sequence of steps for the method is for illustration only, and the steps of the method of the present disclosure are not limited to the sequence specifically described above unless specifically stated otherwise. Furthermore, in some embodiments, the present disclosure may also be implemented as programs recorded in a recording medium, the programs including machine-readable instructions for implementing the methods according to the present disclosure. Thus, the present disclosure also covers a recording medium storing a program for executing the method according to the present disclosure.
It is also noted that in the apparatus, devices and methods of the present disclosure, components or steps may be disassembled and/or assembled. Such decomposition and/or recombination should be considered equivalent to the present disclosure.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the disclosure. Thus, the present disclosure is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit the embodiments of the disclosure to the form disclosed herein. Although a number of example aspects and embodiments have been discussed above, a person of ordinary skill in the art will recognize certain variations, modifications, alterations, additions, and subcombinations thereof.
Claims (8)
1. A model tour method comprising:
In response to receiving an instruction for performing model tour on a target model, determining an initial position of a virtual camera based on at least one observation point position corresponding to the target model;
Moving the virtual camera according to the received displacement instruction, and determining a first distance between the virtual camera and a center point of the target model;
determining whether the virtual camera is in a buffer based on the first distance;
Determining a second distance between the virtual camera and the rotation axis of the target model according to the first distance in response to the virtual camera being in the buffer; determining the second distance as the first distance in response to the first distance being greater than a radius of a first sphere; determining that the second distance is zero in response to the first distance being less than a radius of a second sphere;
In response to a received rotation instruction, determining, according to the second distance, that the target model rotates around the virtual camera according to the rotation instruction or rotates around the rotation axis according to the rotation instruction; comprising the following steps: when the second distance is 0, the virtual camera rotates around itself; when the second distance is the first distance, the virtual camera rotates around the rotation axis; when the virtual camera is in the buffer zone, the rotation axis approaches the virtual camera proportionally with distance until the distance between the rotation axis and the virtual camera is 0.
2. The method of claim 1, wherein prior to determining whether the virtual camera is in a buffer based on the first distance, further comprising:
determining the center point coordinates of the target model based on all vertex coordinates of the target model in a model coordinate system;
Determining a first sphere corresponding to the target model based on the center point coordinates and all the vertex information;
Determining the buffer zone based on the radius of the first sphere and a preset buffer ratio; wherein the value of the preset buffer ratio is between 0 and 1.
3. The method of claim 2, wherein the determining the first sphere corresponding to the target model based on the center point coordinates and the all vertex information comprises:
determining maximum length information of the target model on three coordinate axes of the model coordinate system based on all the vertex coordinates;
and determining the radius of the first sphere based on the center point coordinates and the maximum length information on the three coordinate axes, and determining the first sphere taking the center point coordinates as the sphere center.
4. A method according to claim 2 or 3, wherein said determining the buffer zone based on the radius of the first sphere and a preset buffer ratio comprises:
determining the radius of a second sphere based on the preset buffer ratio and the radius of the first sphere;
determining the second sphere with the center point coordinates as a sphere center based on the radius of the second sphere;
Determining the area between the second sphere and the first sphere as the buffer area.
5. The method of claim 4, wherein the determining whether the virtual camera is in a buffer based on the first distance comprises:
determining whether the first distance is greater than a radius of the second sphere and less than a radius of the first sphere;
responsive to the first distance being greater than a radius of the second sphere and less than a radius of the first sphere, determining that the virtual camera is in the buffer.
6. The method as recited in claim 4, further comprising:
And in response to the virtual camera not being in a buffer, determining that the second distance is the first distance or zero.
7. A model tour device comprising:
The virtual camera determining module is used for determining the initial position of the virtual camera based on at least one observation point position corresponding to the target model in response to receiving an instruction for performing model tour on the target model;
the first distance module is used for moving the virtual camera according to the received displacement instruction and determining a first distance between the virtual camera and the center point of the target model;
the buffer zone judging module is used for determining whether the virtual camera is in a buffer zone or not based on the first distance;
A second distance module for determining a second distance between the virtual camera and a rotation axis of the target model according to the first distance in response to the virtual camera being in the buffer; the second distance module is configured to determine, when the second distance is determined to be the first distance or zero, that the second distance is the first distance in response to the first distance being greater than a radius of a first sphere; determining that the second distance is zero in response to the first distance being less than a radius of a second sphere;
the apparatus further comprises:
A model rotation module for determining, in response to a received rotation instruction and according to the second distance, that the target model rotates around the virtual camera according to the rotation instruction or rotates around the rotation axis according to the rotation instruction; wherein, when the second distance is 0, the virtual camera rotates around itself; when the second distance is the first distance, the virtual camera rotates around the rotation axis; and when the virtual camera is in the buffer zone, the rotation axis approaches the virtual camera proportionally with distance until the distance between the rotation axis and the virtual camera is 0.
8. An electronic device, comprising:
A memory for storing a computer program product;
Processor for executing a computer program product stored in said memory, which, when executed, implements the model tour method according to any of the previous claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310356304.0A CN116363337B (en) | 2023-04-04 | 2023-04-04 | Model tour method and device and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116363337A CN116363337A (en) | 2023-06-30 |
CN116363337B true CN116363337B (en) | 2024-06-11 |
Family
ID=86915649
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107248193A (en) * | 2017-05-22 | 2017-10-13 | 北京红马传媒文化发展有限公司 | The method, system and device that two dimensional surface is switched over virtual reality scenario |
CN108355352A (en) * | 2018-01-31 | 2018-08-03 | 网易(杭州)网络有限公司 | Dummy object control method and device, electronic equipment, storage medium |
CN110045827A (en) * | 2019-04-11 | 2019-07-23 | 腾讯科技(深圳)有限公司 | The observation method of virtual objects, device and readable storage medium storing program for executing in virtual environment |
CN111415386A (en) * | 2020-03-16 | 2020-07-14 | 贝壳技术有限公司 | Shooting equipment position prompting method and device, storage medium and electronic equipment |
CN111880648A (en) * | 2020-06-19 | 2020-11-03 | 华为技术有限公司 | Three-dimensional element control method and terminal |
CN112717409A (en) * | 2021-01-22 | 2021-04-30 | 腾讯科技(深圳)有限公司 | Virtual vehicle control method and device, computer equipment and storage medium |
CN115631291A (en) * | 2022-11-18 | 2023-01-20 | 如你所视(北京)科技有限公司 | Real-time re-illumination method and apparatus, device, and medium for augmented reality |
Non-Patent Citations (1)
Title |
---|
Implementation Method of a Virtual Target Camera in a Virtual Environment; Shen Yang et al.; Computer Simulation (《计算机仿真》); Vol. 24, No. 8; pp. 213-215, 245 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||