
CN117679745B - Method, system and medium for controlling virtual character orientation through multi-angle dynamic detection - Google Patents

Method, system and medium for controlling virtual character orientation through multi-angle dynamic detection

Info

Publication number
CN117679745B
Authority
CN
China
Prior art keywords
rotation
angle
animation
orientation
target person
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410139032.3A
Other languages
Chinese (zh)
Other versions
CN117679745A (en)
Inventor
李宏雪
陈亚南
殷超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Weisaike Network Technology Co ltd
Original Assignee
Nanjing Weisaike Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Weisaike Network Technology Co ltd filed Critical Nanjing Weisaike Network Technology Co ltd
Priority to CN202410139032.3A priority Critical patent/CN117679745B/en
Publication of CN117679745A publication Critical patent/CN117679745A/en
Application granted granted Critical
Publication of CN117679745B publication Critical patent/CN117679745B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525: Changing parameters of virtual cameras
    • A63F13/5255: Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method, a system and a medium for controlling the orientation of a virtual character through multi-angle dynamic detection, belonging to the technical field of character control. The method comprises the following steps: acquiring an orientation vector N of a target person and a camera view-angle vector M; dynamically detecting the camera view angle, recording a direction vector M1 when the camera view angle starts rotating and a direction vector M2 when it stops rotating, calculating a rotation angle α, and detecting the holding time t for which the camera view angle remains stopped; setting a detection time T and a detection angle β1; if t < T, or if t ≥ T and |α| < β1, keeping the orientation vector N of the target person unchanged; if t ≥ T and |α| ≥ β1, controlling the target person to execute a rotation animation so that the orientation vector N of the target person coincides with the direction vector M2, and synchronizing the execution of the rotation animation to other users in the virtual scene. By adjusting the target person's orientation to match the rotated camera view direction, the invention makes the character's facing direction easy to grasp.

Description

Method, system and medium for controlling virtual character orientation through multi-angle dynamic detection
Technical Field
The invention relates to the technical field of character control, and in particular to a method, a system and a medium for controlling the orientation of virtual characters through multi-angle dynamic detection.
Background
The follow-camera view-angle adjustment technique is commonly used in video production and game development. It allows the camera to dynamically adjust its view angle according to specific rules or user input in order to better follow a particular object or scene. In game development, it is often used to control the view angle of the player character. For example, in a third-person action game the camera typically follows the player character's movement, keeps the character centered on screen, and adjusts as the scene requires. This provides a better gaming experience and makes it easier for players to keep track of the character and its environment.
In particular, under a third-person view angle, movement operations such as moving the character forward or backward must be referenced to the user's current view direction, so the character normally needs to stay aligned with the camera view angle. In practice, however, usage requirements vary: sometimes the user only wants to rotate the camera view angle without rotating the character. The conventional solution is to decouple the camera's rotation control from the character's orientation control by pressing a designated key, so that the camera view angle can rotate freely while the character's orientation stays unchanged. Existing virtual-character orientation control therefore cannot satisfy these varied demands.
Disclosure of Invention
The invention aims to solve the problem of controlling the orientation of the character, and provides a method, a system and a medium for controlling the orientation of the virtual character by multi-angle dynamic detection.
In a first aspect, the present invention achieves the above object by a method for dynamically detecting and controlling the orientation of a virtual character at multiple angles, the method comprising the steps of:
acquiring an orientation vector N and a camera view angle vector M of a target person in a virtual scene;
dynamically detecting the camera view angle, wherein the dynamic detection comprises view-angle rotation detection and operation-stop detection; the view-angle rotation detection records a direction vector M1 when the camera view angle starts rotating and a direction vector M2 when it stops rotating, a rotation angle α is calculated from M1 and M2, and the operation-stop detection detects the holding time t for which the camera view angle remains stopped;
setting a detection time T and a detection angle beta 1;
if t < T, the orientation vector N of the target person is kept unchanged;
if t ≥ T and |α| < β1, the orientation vector N of the target person is kept unchanged;
if t ≥ T and |α| ≥ β1, the target person is controlled to execute a rotation animation so that the orientation vector N of the target person coincides with the direction vector M2, and the execution of the rotation animation is synchronized to other users in the virtual scene.
Preferably, the detection time T is set to be between 1 and 2s, and the detection angle β1 is set to be between 0 and 1 °.
Preferably, the method further comprises setting a detection angle β2 with β1 < β2, the rotation animation comprising a first animation and a second animation; then:
if t ≥ T and β1 ≤ |α| < β2, the target person is controlled to execute the first animation so that its orientation vector N coincides with the direction vector M2;
if t ≥ T and |α| ≥ β2, the target person is controlled to execute the second animation so that its orientation vector N coincides with the direction vector M2.
Preferably, the detection angle β2 is set to be between 1 and 10 °.
Preferably, the rotation animation is played through a pre-designed animation model: the rotation angle α is input to the animation model, and when the target person finishes playing the rotation animation, its orientation vector N coincides with the direction vector M2.
Preferably, the method further comprises setting the direction vector M1 at the start of rotation as a reference axis; if the direction vector M2 at the end of rotation lies in the left half-plane of the reference axis, the calculated rotation angle is α < 0°; if M2 lies in the right half-plane of the reference axis, the calculated rotation angle is α > 0°; if the rotation angle α input to the animation model satisfies α < 0°, the rotation animation turns the character to the left, and if α > 0°, it turns the character to the right.
In a second aspect, the present invention achieves the above object by a system for dynamically detecting and controlling the orientation of a virtual character at multiple angles, the system comprising:
the direction acquisition unit is used for acquiring an orientation vector N of a target person in the virtual scene and a camera view angle vector M;
a view-angle detection unit for dynamically detecting the camera view angle, the dynamic detection comprising view-angle rotation detection and operation-stop detection; the view-angle rotation detection records a direction vector M1 when the camera view angle starts rotating and a direction vector M2 when it stops rotating, a rotation angle α is calculated from M1 and M2, and the operation-stop detection detects the holding time t for which the camera view angle remains stopped;
an orientation control unit for setting a detection time T and a detection angle β1;
if t < T, the orientation vector N of the target person is kept unchanged;
if t ≥ T and |α| < β1, the orientation vector N of the target person is kept unchanged;
if t ≥ T and |α| ≥ β1, the target person is controlled to execute a rotation animation so that the orientation vector N of the target person coincides with the direction vector M2, and the execution of the rotation animation is synchronized to other users in the virtual scene.
Preferably, the orientation control unit is further configured to set a detection angle β2, and β1 is less than β2, and the rotation animation includes a first animation and a second animation, then:
if t ≥ T and β1 ≤ |α| < β2, the target person is controlled to execute the first animation so that its orientation vector N coincides with the direction vector M2;
if t ≥ T and |α| ≥ β2, the target person is controlled to execute the second animation so that its orientation vector N coincides with the direction vector M2.
Preferably, the direction control unit is further configured to set the direction vector M1 at the start of rotation as a reference axis; if the direction vector M2 at the end of rotation lies in the left half-plane of the reference axis, the calculated rotation angle is α < 0°; if M2 lies in the right half-plane, the calculated rotation angle is α > 0°; if the rotation angle α input to the animation model satisfies α < 0°, the rotation animation turns the character to the left, and if α > 0°, it turns the character to the right.
In a third aspect, the present invention achieves the above object by a medium having stored thereon a computer program which, when executed by a processor, implements the method for controlling the orientation of a virtual character by multi-angle dynamic detection as described in the first aspect.
Compared with the prior art, the invention has the following beneficial effects. When the camera view angle has stopped rotating for longer than the detection time, the invention controls the orientation of the target person to match the camera view direction, so that other users in the scene can see where the person is gazing. Before controlling the person's orientation, the invention also evaluates the rotation angle: when the rotation angle does not exceed the set detection angle, the person's orientation does not rotate even though the camera view angle has rotated; only when the rotation angle exceeds the set detection angle is the target person's orientation aligned with the camera view direction. Using both the detection time and the detection angle to decide whether the target person's orientation needs adjusting reduces spurious triggers and makes the orientation adjustment more accurate.
Drawings
FIG. 1 is a flow chart of a method for dynamically detecting and controlling the orientation of a virtual character at multiple angles according to the present invention.
Fig. 2 is a schematic view of the range of the region for performing the first animation and the second animation according to the present invention.
Fig. 3 is a schematic view of the range of the area for performing left and right turns of the present invention.
FIG. 4 is a schematic diagram of a system for dynamically detecting and controlling the orientation of virtual characters at multiple angles according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Example 1
As shown in fig. 1, a method for dynamically detecting and controlling the orientation of a virtual character at multiple angles includes the following steps:
in the step S1, a direction vector N of a target person in the virtual scene and a camera view angle vector M are obtained, wherein in the virtual scene, the direction vector N is a normal vector of a center point of the virtual person toward the right front, and the camera view angle vector M is a normal vector of a center point of the camera toward the right front. In the virtual scene, the movement of the virtual character and the rotation of the camera view angle are independently controlled, taking a computer end as an example, the keyboard can control the displacement of the virtual character, the mouse can control the camera view angle, along with the movement of the mouse, the picture seen by a user rotates, at the moment, the camera view angle vector M changes, if the virtual character also rotates, the orientation vector N also changes, and the preparation of the orientation of the character following the camera view angle is made by firstly acquiring the orientation vector N and the camera view angle vector M.
Step S2: dynamically detect the camera view angle, where the dynamic detection comprises view-angle rotation detection and operation-stop detection; the view-angle rotation detection records a direction vector M1 when the camera view angle starts rotating and a direction vector M2 when it stops rotating, a rotation angle α is calculated from M1 and M2, and the operation-stop detection detects the holding time t for which the camera view angle remains stopped. The camera view angle is detected in real time. The view-angle rotation detection determines whether the user's operation has deflected the view; if a deflection occurs, the detection does not record the deflection trajectory but only the direction vector M1 at the start and the direction vector M2 at the end, so that the deflection of this operation, namely the rotation angle α, can be obtained from the two direction vectors. The rotation animation is played through a pre-designed animation model: the rotation angle α is input to the animation model, and when the target character finishes playing the rotation animation, its orientation vector N coincides with the direction vector M2.
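The angle computation just described can be sketched as follows. This is an illustrative reconstruction rather than the patent's implementation; it assumes the direction vectors M1 and M2 are reduced to unit vectors on the 2-D ground plane:

```python
import math

def rotation_angle(m1, m2):
    """Unsigned rotation angle alpha, in degrees, between two unit
    direction vectors given as (x, z) ground-plane pairs."""
    dot = m1[0] * m2[0] + m1[1] * m2[1]
    dot = max(-1.0, min(1.0, dot))  # clamp floating-point drift into [-1, 1]
    return math.degrees(math.acos(dot))

# View starts facing +z (M1) and ends facing +x (M2): a 90-degree rotation.
print(rotation_angle((0.0, 1.0), (1.0, 0.0)))  # → 90.0
```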
The rotation angle α serves as a reference for the subsequent orientation control, so that the person's orientation can be aligned with the rotated camera view direction; α determines how far the person must rotate. The operation-stop detection decides, after the user has deflected the view angle, whether the target person actually needs to be rotated: the holding time t makes this determination. If the holding time t is too short, the target person's orientation need not change; otherwise it is controlled. This has practical benefits. For example, when the user has the virtual person facing a target, the view also faces that target; if the user does not want to leave the target but only wants to look at the surrounding environment, the user can deflect the view angle and sweep it back to the original position. The view inevitably pauses briefly as it passes through each region, but since the user only intends a quick look around, such a temporary stop should not rotate the target person's orientation. Judging by the holding time t therefore matches the user's expectation and actual usage conditions.
The rotation angle α is not only the reference for keeping the orientations consistent but is also used to judge whether the target person needs to be rotated at all. In practice, the user controls the virtual person through a computer or a head-mounted display; a finger brushing the mouse or the headset shaking can cause a tiny deflection of the camera view angle. Such a tiny deflection is not a genuine intention to rotate the camera view angle, so the target person's orientation need not change, which makes the method better fit actual usage conditions.
Step S3: set a detection time T and a detection angle β1. The detection time T is used for comparison: as mentioned in step S2, the orientation is made consistent with the final camera view direction only when the holding time t reaches a certain value, and the detection time T is that threshold. A timer is generally used to measure the time: a threshold equal to the detection time T is set on the timer, which starts once the camera view angle stops rotating; when the timer exceeds the threshold, the holding time t has exceeded the detection time T, and otherwise it has not.
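A minimal sketch of such a timer for the operation-stop detection, assuming a monotonic clock; the class and callback names are illustrative, not from the patent:

```python
import time

class StopTimer:
    """Tracks how long the camera view angle has remained stopped."""

    def __init__(self):
        self._stopped_at = None  # monotonic timestamp of the last stop

    def on_rotation(self):
        # Called while the view angle is rotating: reset the timer.
        self._stopped_at = None

    def on_rotation_stopped(self):
        # Called when the rotation input ends: start timing.
        self._stopped_at = time.monotonic()

    def hold_time(self):
        """Holding time t in seconds, or 0.0 while still rotating."""
        if self._stopped_at is None:
            return 0.0
        return time.monotonic() - self._stopped_at
```

The caller compares `hold_time()` against the detection time T each frame.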
The detection angle β1 is the angle threshold for camera view-angle rotation. As mentioned in step S2, the rotation angle α is calculated from the starting direction vector M1 and the ending direction vector M2; when the value of the rotation angle α is small, the target person's orientation need not rotate. The detection angle β1 is therefore compared against the rotation angle α, and the target person's orientation is controlled only when β1 is exceeded.
The manner of judging whether the target person's orientation needs to be rotated, using the detection time T and the detection angle β1, is as follows:
a1, if T is smaller than T, the orientation vector N of the target person is kept unchanged, and the target person orientation control is performed on the premise that when the user stops rotating the camera view angle for a certain time, the step only needs to judge whether the holding time T reaches the detection time T, does not need to judge the rotation angle alpha, and does not need to control the target person orientation when the holding time T does not reach the standard of the detection time T.
A2: if t ≥ T and |α| < β1, the orientation vector N of the target person is kept unchanged. This step differs from the previous one in that the holding time t has reached the detection-time threshold T, so the rotation angle α must be judged; since the rotation angle is smaller than the detection angle β1, the target person's orientation need not be controlled.
a3, if T is equal to or greater than T and |alpha|β1, controlling the target person to execute the rotation animation so that the direction vector N of the target person is consistent with the direction vector M2, and synchronizing the execution of the rotation animation to other users in the virtual scene.
Note that in step A2 and step A3 above, the rotation angle α is compared with the detection angle β1 as |α|, i.e. the absolute value of α, against β1. The rotation angle α may be defined over a 0 to 360° range or over a range limited to 180° in magnitude; in the latter case, a rotation exceeding 180° yields a negative value of α, so |α| is chosen for the comparison with β1.
To make the judgments on the detection time T and the detection angle β1 better match actual usage, the detection time T is set between 1 and 2 s and the detection angle β1 between 0 and 1°. A dwell of 1 to 2 s is enough for a user who merely pauses to observe the target's surroundings, so only when this time is exceeded is it judged that the user rotated the camera view angle in order to change the target person's orientation. Likewise, the detection angle provides an allowance for camera view-angle jitter: only when the rotation exceeds this angle is it judged that the camera view angle was rotated in order to change the target person's orientation.
Playing the rotation animation is a continuous process. When the rotation animation is designed, the playing duration of the complete action is fixed, and the rotation is realized by inputting the rotation angle. Because the playing duration is fixed, the smaller the rotation angle, the slower the target person rotates; conversely, the larger the rotation angle, the faster the rotation. If only one animation were used for every rotation, the motion would easily look uncoordinated to the user at small angles, so the following approach is provided. As shown in fig. 2, the method further includes setting a detection angle β2 with β1 < β2, the rotation animation comprising a first animation and a second animation. The first animation is designed with a small motion amplitude, so that it does not look uncoordinated even at a small rotation angle; the second animation is designed with a large motion amplitude, because at a large rotation angle the character rotates quickly and a large amplitude does not look uncoordinated.
The rule for judging whether the target person should execute the first animation or the second animation is as follows:
if t ≥ T and β1 ≤ |α| < β2, the target person is controlled to execute the first animation so that its orientation vector N coincides with the direction vector M2;
if t ≥ T and |α| ≥ β2, the target person is controlled to execute the second animation so that its orientation vector N coincides with the direction vector M2. The detection angle β2 is set between 1 and 10°: rotation angles within this interval are treated as the small-angle case, and angles of 10° or more as the large-angle case.
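The two-threshold selection above can be sketched as follows. The function and clip names are illustrative, with default thresholds taken from the preferred ranges (β1 under 1°, β2 at the 10° boundary):

```python
def pick_rotation_animation(t, alpha, T=1.5, beta1=0.5, beta2=10.0):
    """Choose which rotation animation to play, or None.
    'first'  = small-amplitude clip for beta1 <= |alpha| < beta2,
    'second' = large-amplitude clip for |alpha| >= beta2."""
    if t < T or abs(alpha) < beta1:
        return None  # keep the orientation vector N unchanged
    return "first" if abs(alpha) < beta2 else "second"

print(pick_rotation_animation(2.0, 4.0))   # → first  (small angle)
print(pick_rotation_animation(2.0, 45.0))  # → second (large angle)
```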
Step S2 mentioned that the view-angle rotation detection does not record the deflection trajectory, only the starting direction vector M1 and the ending direction vector M2; steps A2 and A3 mentioned that the rotation angle α may be defined over a 0 to 360° range or over a range limited to 180° in magnitude. To improve the efficiency of controlling the target person's orientation, the range limited to 180° is the better choice, because a rotation that never exceeds 180° is executed more efficiently. This range, however, raises a problem: if the user rotates the camera view angle by more than 180°, the calculated rotation angle α is negative, which makes controlling the target person's orientation inconvenient.
Thus, in response to the above problems, the following improvements have been made:
as shown in fig. 3, the method further includes setting a direction vector M1 at the start of rotation as a reference axis, and if the direction vector M2 at the end of rotation is located at the left half of the reference axis, calculating a rotation angle α < 0 °; if the direction vector M2 at the end of rotation is positioned at the right half of the reference shaft, the calculated rotation angle alpha & gt 0 degrees; if the rotation angle alpha of the input animation model is less than 0 degrees, executing the rotation animation to rotate leftwards, if the rotation angle alpha of the input animation model is greater than 0 degrees, executing the rotation animation to rotate rightwards, dividing the rotation area of the target person into two parts, namely a left area and a right area, taking the direction vector M1 before the rotation of the camera view angle as a central axis, and enabling the angle of the right rotation of the camera view angle to exceed 180 degrees, enabling the camera view angle to reach the left area from the right area, so that the rotation animation is designed to be in left rotation and right rotation, and judging that the direction of the control target person turns leftwards at the moment is most efficient.
Example 2
As shown in fig. 4, a system for dynamically detecting and controlling the orientation of a virtual character at multiple angles, the system comprising:
the direction acquisition unit is used for acquiring an orientation vector N of a target person in the virtual scene and a camera view angle vector M;
a view-angle detection unit for dynamically detecting the camera view angle, the dynamic detection comprising view-angle rotation detection and operation-stop detection; the view-angle rotation detection records a direction vector M1 when the camera view angle starts rotating and a direction vector M2 when it stops rotating, a rotation angle α is calculated from M1 and M2, and the operation-stop detection detects the holding time t for which the camera view angle remains stopped;
an orientation control unit for setting a detection time T and a detection angle beta 1, wherein the detection time T is between 1 and 2s, and the detection angle beta 1 is between 0 and 1 degrees;
if t < T, the orientation vector N of the target person is kept unchanged;
if t ≥ T and |α| < β1, the orientation vector N of the target person is kept unchanged;
if t ≥ T and |α| ≥ β1, the target person is controlled to execute the rotation animation so that its orientation vector N coincides with the direction vector M2, and the execution of the rotation animation is synchronized to other users in the virtual scene.
The orientation control unit is further used for setting a detection angle beta 2, and beta 1 < beta 2, wherein the detection angle beta 2 is between 1 and 10 degrees, the rotation animation comprises a first animation and a second animation, and then:
if t ≥ T and β1 ≤ |α| < β2, the target person is controlled to execute the first animation so that its orientation vector N coincides with the direction vector M2;
if t ≥ T and |α| ≥ β2, the target person is controlled to execute the second animation so that its orientation vector N coincides with the direction vector M2.
The rotation animation is played through a pre-designed animation model: the rotation angle α is input to the animation model, and when the target person finishes playing the rotation animation, its orientation vector N coincides with the direction vector M2.
The direction control unit is also used to set the direction vector M1 at the start of rotation as a reference axis; if the direction vector M2 at the end of rotation lies in the left half-plane of the reference axis, the calculated rotation angle is α < 0°; if M2 lies in the right half-plane, the calculated rotation angle is α > 0°; if the rotation angle α input to the animation model satisfies α < 0°, the rotation animation turns the character to the left, and if α > 0°, it turns the character to the right.
Since embodiment 2 is substantially the same as embodiment 1, the operation principle of each unit in embodiment 2 will not be described in detail.
Example 3
This embodiment provides a medium comprising a program storage area and a data storage area: the program storage area can store an operating system and the programs required to run functions such as instant messaging; the data storage area can store various instant-messaging information, operation instruction sets, and the like. A computer program is stored in the program storage area which, when executed by a processor, implements the method of controlling the orientation of a virtual character through multi-angle dynamic detection described in embodiment 1. The processor may comprise one or more central processing units (CPUs), a digital processing unit, or the like.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
Furthermore, it should be understood that although the present description is organized by embodiments, not every embodiment contains only a single independent technical solution. This manner of description is adopted for clarity only; the embodiments may be combined as appropriate to form other implementations that will be apparent to those skilled in the art.

Claims (10)

1. A method for controlling the orientation of a virtual character through multi-angle dynamic detection, comprising the steps of:
acquiring an orientation vector N and a camera view angle vector M of a target person in a virtual scene;
dynamically detecting the camera view angle, wherein the dynamic detection comprises view angle rotation detection and operation stop detection; the view angle rotation detection records a direction vector M1 when the camera view angle starts to rotate and a direction vector M2 when the camera view angle stops rotating, and a rotation angle α is calculated from M1 and M2; the operation stop detection detects the holding time t for which the camera view angle has remained stopped;
setting a detection time T and a detection angle beta 1;
if t < T, the orientation vector N of the target person remains unchanged;
if t ≥ T and |α| < β1, the orientation vector N of the target person remains unchanged;
if t ≥ T and |α| ≥ β1, the target person is controlled to execute a rotation animation so that the orientation vector N of the target person coincides with the direction vector M2, and the execution of the rotation animation is synchronized with other users in the virtual scene.
2. The method for dynamically detecting and controlling the orientation of a virtual character at multiple angles according to claim 1, wherein the detection time T is set between 1 s and 2 s, and the detection angle β1 is set between 0° and 1°.
3. The method of multi-angle dynamic detection control of virtual character orientation according to claim 2, further comprising setting a detection angle β2, with β1 < β2, wherein the rotation animation comprises a first animation and a second animation; then:
if t ≥ T and β1 ≤ |α| < β2, the target person is controlled to execute the first animation so that the orientation vector N of the target person coincides with the direction vector M2;
if t ≥ T and |α| ≥ β2, the target person is controlled to execute the second animation so that the orientation vector N of the target person coincides with the direction vector M2.
4. The method for controlling the orientation of a virtual character according to claim 3, wherein the detection angle β2 is set between 1° and 10°.
5. A method for dynamically detecting and controlling the orientation of a virtual character at multiple angles according to claim 1 or 3, wherein the rotation animation is played through a pre-designed animation model, the rotation angle α is input to the animation model, and the orientation vector N of the target person coincides with the direction vector M2 when the rotation animation finishes playing.
6. The method of claim 5, further comprising setting the direction vector M1 at the start of rotation as a reference axis, wherein if the direction vector M2 at the end of rotation lies in the left half of the reference axis, the calculated rotation angle is α < 0°; if the direction vector M2 at the end of rotation lies in the right half of the reference axis, the calculated rotation angle is α > 0°; and if the rotation angle α input to the animation model is less than 0°, the rotation animation rotates to the left, and if greater than 0°, the rotation animation rotates to the right.
7. A system for controlling the orientation of a virtual character by multi-angle dynamic detection, the system comprising:
the direction acquisition unit is used for acquiring an orientation vector N of a target person in the virtual scene and a camera view angle vector M;
a view angle detection unit for dynamically detecting the camera view angle, wherein the dynamic detection comprises view angle rotation detection and operation stop detection; the view angle rotation detection records a direction vector M1 when the camera view angle starts to rotate and a direction vector M2 when the camera view angle stops rotating, and a rotation angle α is calculated from M1 and M2; the operation stop detection detects the holding time t for which the camera view angle has remained stopped;
an orientation control unit for setting a detection time T and a detection angle β1;
if t < T, the orientation vector N of the target person remains unchanged;
if t ≥ T and |α| < β1, the orientation vector N of the target person remains unchanged;
if t ≥ T and |α| ≥ β1, the target person is controlled to execute a rotation animation so that the orientation vector N of the target person coincides with the direction vector M2, and the execution of the rotation animation is synchronized with other users in the virtual scene.
8. The system for dynamically detecting and controlling the orientation of a virtual character at multiple angles according to claim 7, wherein the orientation control unit is further configured to set a detection angle β2, with β1 < β2, and the rotation animation comprises a first animation and a second animation; then:
if t ≥ T and β1 ≤ |α| < β2, the target person is controlled to execute the first animation so that the orientation vector N of the target person coincides with the direction vector M2;
if t ≥ T and |α| ≥ β2, the target person is controlled to execute the second animation so that the orientation vector N of the target person coincides with the direction vector M2.
9. The system for dynamically detecting and controlling the orientation of a virtual character at multiple angles according to claim 8, wherein the orientation control unit is further configured to set the direction vector M1 at the start of rotation as a reference axis; if the direction vector M2 at the end of rotation lies in the left half of the reference axis, the calculated rotation angle is α < 0°; if the direction vector M2 at the end of rotation lies in the right half of the reference axis, the calculated rotation angle is α > 0°; and if the rotation angle α input to the animation model is less than 0°, the rotation animation rotates to the left, and if greater than 0°, the rotation animation rotates to the right.
10. A medium having stored thereon a computer program which, when executed by a processor, implements the method of multi-angle dynamic detection control of virtual character orientation as claimed in any one of claims 1-6.
CN202410139032.3A 2024-02-01 2024-02-01 Method, system and medium for controlling virtual character orientation through multi-angle dynamic detection Active CN117679745B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410139032.3A CN117679745B (en) 2024-02-01 2024-02-01 Method, system and medium for controlling virtual character orientation through multi-angle dynamic detection

Publications (2)

Publication Number Publication Date
CN117679745A CN117679745A (en) 2024-03-12
CN117679745B (en) 2024-04-12

Family

ID=90135652

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410139032.3A Active CN117679745B (en) 2024-02-01 2024-02-01 Method, system and medium for controlling virtual character orientation through multi-angle dynamic detection

Country Status (1)

Country Link
CN (1) CN117679745B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105597311A (en) * 2015-12-25 2016-05-25 网易(杭州)网络有限公司 Method and device for controlling camera in 3d game
CN112451969A (en) * 2020-12-04 2021-03-09 腾讯科技(深圳)有限公司 Virtual object control method and device, computer equipment and storage medium
CN115317913A (en) * 2022-08-18 2022-11-11 网易(杭州)网络有限公司 Role movement control method and device, electronic equipment and readable storage medium
CN116020119A (en) * 2022-12-28 2023-04-28 网易(杭州)网络有限公司 Visual field control method and device in virtual scene and electronic terminal
CN116440491A (en) * 2022-12-30 2023-07-18 网易(杭州)网络有限公司 Interaction method, device, equipment and storage medium of handle and terminal equipment
CN117170504A (en) * 2023-11-01 2023-12-05 南京维赛客网络科技有限公司 Method, system and storage medium for viewing with person in virtual character interaction scene
CN117244249A (en) * 2023-11-03 2023-12-19 北京字跳网络技术有限公司 Multimedia data generation method and device, readable medium and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116440495A (en) * 2022-01-07 2023-07-18 腾讯科技(深圳)有限公司 Scene picture display method and device, terminal and storage medium


Similar Documents

Publication Publication Date Title
EP1552375B1 (en) Man-machine interface using a deformable device
US6191784B1 (en) User interface system and method for controlling playback time-based temporal digital media
US5779548A (en) Game apparatus and method of replaying game
CN104159016B (en) Cloud platform control system, method and device
JP2012221498A (en) System and method for providing feedback by user gaze and gestures
JP4848000B2 (en) GAME DEVICE, GAME PROCESSING METHOD, AND PROGRAM
JP2001149643A (en) Object display method in three-dimensional game, information recording medium, and entertainment device
JP2001246161A (en) Device and method for game using gesture recognizing technic and recording medium storing program to realize the method
US10695675B2 (en) System and method for creation and control of user interfaces for interaction with video content
JPH10225572A (en) Display method and control method in video game device
JP2002219278A (en) Information processing method, program execution device, information processing program for execution by computer, and storage medium
JP2001075553A (en) Screen regenerating device and comuter readable recording medium
CN107547930A (en) A kind of video play controller and control method based on mobile device rotation detection
CN117679745B (en) Method, system and medium for controlling virtual character orientation through multi-angle dynamic detection
US20240100424A1 (en) Bifurcation of gameplay between mobile and non-mobile play with intelligent game state saving, and startups
WO2023221923A1 (en) Video processing method and apparatus, electronic device and storage medium
JP2011039895A (en) Virtual space display device, viewpoint setting method, and program
WO2022151864A1 (en) Virtual reality device
CN116017082A (en) Information processing method and electronic equipment
JP2907200B2 (en) Game device having game replay function and method thereof
US11745098B2 (en) Remote play using a local projector
WO2020170831A1 (en) Information processing device, information processing method, and program
CN118672455A (en) Nasal endoscope reading playing control system
US20220245887A1 (en) Image processing device and image processing method
JP4155482B2 (en) GAME DEVICE AND COMPUTER-READABLE RECORDING MEDIUM CONTAINING GAME PROGRAM

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant