
WO2019201047A1 - Method, apparatus, and readable storage medium for viewing angle adjustment in a virtual environment - Google Patents

Method, apparatus, and readable storage medium for viewing angle adjustment in a virtual environment Download PDF

Info

Publication number
WO2019201047A1
WO2019201047A1 (PCT/CN2019/078756; CN2019078756W)
Authority
WO
WIPO (PCT)
Prior art keywords
viewing angle
view
virtual object
camera model
virtual environment
Prior art date
Application number
PCT/CN2019/078756
Other languages
English (en)
French (fr)
Inventor
王晗
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 filed Critical 腾讯科技(深圳)有限公司
Publication of WO2019201047A1 publication Critical patent/WO2019201047A1/zh
Priority to US16/937,332 priority Critical patent/US11151773B2/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 Simulators for teaching or training purposes
    • G09B9/003 Simulators for teaching or training purposes for military purposes and tactics
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/5255 Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/822 Strategy games; Role-playing games
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • A63F2300/6676 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera by dedicated player input
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/807 Role playing or strategy games
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/24 Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/003 Maps
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 Simulators for teaching or training purposes
    • G09B9/006 Simulators for teaching or training purposes for locating or ranging of objects

Definitions

  • the embodiments of the present invention relate to the field of computer graphics processing, and in particular, to a method, an apparatus, and a readable storage medium for performing perspective adjustment in a virtual environment.
  • applications based on a virtual environment include, for example, virtual reality applications, third-person shooting (TPS) games, first-person shooting (FPS) games, and multiplayer online battle arena (MOBA) games.
  • display elements such as virtual objects, virtual items, and the ground are rendered with three-dimensional models to achieve a stereoscopic effect.
  • the user can rotate the perspective of the virtual object in the virtual environment by dragging.
  • the user can control the virtual object by sliding a joystick control with the left hand while sliding on the right half of the screen, where the joystick control moves the virtual object forward, backward, left, or right without changing its orientation, and sliding on the right half of the screen rotates the direction in which the virtual object faces in the virtual environment (that is, the viewing direction).
  • when the viewing angle is rotated in this way, the state of the virtual object also changes: the model of the virtual object is rotated simultaneously in the virtual environment, so the orientation of the virtual object rotates as well.
  • rotating the viewing angle therefore changes the direction in which the virtual object walks or runs, which makes operating in the virtual environment inconvenient.
  • the embodiments of the present invention provide a method, an apparatus, and a readable storage medium for performing viewing angle adjustment in a virtual environment, which can solve the problem that adjusting the viewing angle changes the direction in which the virtual object walks or runs, so that the two operations no longer interfere with each other.
  • the technical solution is as follows:
  • a method for performing viewing angle adjustment in a virtual environment, applied to a terminal, the method including:
  • displaying a first view picture, where the first view picture is a picture when the virtual object is observed in a first viewing direction in the virtual environment, the first view picture includes the virtual object having a first orientation, and a viewing angle adjustment control is superimposed on the first view picture;
  • receiving a drag instruction on the viewing angle adjustment control; adjusting the first viewing direction according to the drag instruction to obtain a second viewing direction; and displaying a second view picture.
  • the second view picture is a picture when the virtual object is viewed in the second view direction in the virtual environment, and the second view picture includes the virtual object having the first orientation.
  • an apparatus for performing viewing angle adjustment in a virtual environment comprising:
  • a display module configured to display a first view picture, where the first view picture is a picture when a virtual object is observed in a first viewing direction in a virtual environment, the first view picture includes the virtual object having a first orientation, and a viewing angle adjustment control is superimposed on the first view picture;
  • a receiving module configured to receive a drag instruction for a viewing angle adjustment control
  • An adjustment module configured to adjust a direction of the first viewing angle according to the dragging instruction to obtain a second viewing angle direction
  • the display module is further configured to display a second view screen, where the second view picture is a picture when the virtual object is viewed in the second view direction in the virtual environment, and the second view picture includes the virtual object having the first orientation.
  • a terminal comprising a processor and a memory, wherein the memory stores at least one instruction, at least one program, a code set or a set of instructions, the at least one instruction, the at least one program, the The code set or the instruction set is loaded and executed by the processor to implement a method for performing angle of view adjustment in a virtual environment as described in the above embodiment of the present application.
  • a computer readable storage medium having stored therein at least one instruction, at least one program, a code set, or a set of instructions, the at least one instruction, the at least one program,
  • the code set or the instruction set is loaded and executed by the processor to implement a method for performing view adjustment in a virtual environment as described in the above-described embodiments of the present application.
  • a computer program product that, when executed on a computer, causes the computer to perform a method of viewing angle adjustment in a virtual environment as described in the above-described embodiments of the present application.
  • by providing the viewing angle adjustment control, when the user operates the control on the first user interface, the viewing direction in which the virtual environment is observed can be adjusted without changing the orientation of the virtual object. That is, while the virtual object keeps its original orientation and motion posture, the user can observe other viewing directions of the virtual environment by operating the viewing angle adjustment control, so observing another viewing direction does not change the virtual object's direction of travel in the virtual environment. This improves the user's convenience when observing the virtual environment and avoids interference between the viewing angle adjustment operation and the motion track of the virtual object.
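The decoupling described above can be illustrated with a minimal sketch (the type and function names are hypothetical; the patent does not specify an implementation): the virtual object's orientation and the camera's viewing direction are stored as independent state, so adjusting one never modifies the other.

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    orientation_deg: float  # direction the character model faces (yaw, degrees)

@dataclass
class CameraView:
    yaw_deg: float          # viewing direction around the virtual object
    pitch_deg: float

def adjust_view(view: CameraView, d_yaw: float, d_pitch: float) -> CameraView:
    """Rotate only the viewing direction; the virtual object is not an input."""
    return CameraView(view.yaw_deg + d_yaw,
                      max(-90.0, min(90.0, view.pitch_deg + d_pitch)))

obj = VirtualObject(orientation_deg=0.0)
view = CameraView(yaw_deg=0.0, pitch_deg=-10.0)
view = adjust_view(view, d_yaw=45.0, d_pitch=5.0)
assert obj.orientation_deg == 0.0  # orientation unchanged by the view adjustment
```

Because `adjust_view` never receives the `VirtualObject`, a view adjustment cannot, by construction, alter the character's orientation or direction of travel.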
  • FIG. 1 is a schematic diagram of a camera model provided by an exemplary embodiment of the present application.
  • FIG. 2 is a structural block diagram of an electronic device provided by an exemplary embodiment of the present application.
  • FIG. 3 is a structural block diagram of a computer system according to an exemplary embodiment of the present application.
  • FIG. 4 is a flowchart of a method for performing perspective adjustment in a virtual environment according to an exemplary embodiment of the present application
  • FIG. 5 is a schematic diagram of a user interface provided based on the embodiment illustrated in FIG. 4;
  • FIG. 6 is a schematic diagram of a user interface in a viewing angle adjustment process provided based on the embodiment illustrated in FIG. 4;
  • FIG. 7 is a flowchart of a method for performing perspective adjustment in a virtual environment according to another exemplary embodiment of the present application.
  • FIG. 8 is a schematic diagram of a drag direction provided based on the embodiment illustrated in FIG. 7;
  • FIG. 9 is a schematic diagram of another drag direction provided based on the embodiment illustrated in FIG. 7;
  • FIG. 10 is a schematic diagram showing a correspondence relationship between a viewing angle picture and a drag direction according to the embodiment shown in FIG. 7;
  • FIG. 11 is a flowchart of a method for performing perspective adjustment in a virtual environment according to another exemplary embodiment of the present application.
  • FIG. 12 is a flowchart of a method for performing perspective adjustment in a virtual environment according to another exemplary embodiment of the present application.
  • FIG. 13 is a flowchart of a method for performing perspective adjustment in a virtual environment according to another exemplary embodiment of the present application.
  • FIG. 14 is a structural block diagram of an apparatus for performing perspective adjustment in a virtual environment according to an exemplary embodiment of the present application.
  • FIG. 15 is a structural block diagram of an apparatus for performing perspective adjustment in a virtual environment according to another exemplary embodiment of the present application.
  • FIG. 16 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • Virtual environment A virtual environment that is displayed (or provided) when an application runs on a terminal.
  • the virtual environment can be a real-world simulation environment, a semi-simulated semi-fictional three-dimensional environment, or a purely fictitious three-dimensional environment.
  • the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment.
  • the following embodiments are exemplified by the virtual environment being a three-dimensional virtual environment, but are not limited thereto.
  • the virtual environment is further configured to perform a virtual environment battle between at least two virtual characters.
  • Virtual object refers to a movable object in a virtual environment.
  • the movable object includes at least one of a virtual character, a virtual creature, and an anime character.
  • the virtual object is a three-dimensional model created based on animated bone technology.
  • Each virtual object has its own shape and volume in a three-dimensional virtual environment, occupying a part of the space in the three-dimensional virtual environment.
  • Viewing direction refers to the viewing direction when viewing in the virtual environment with the first person perspective or the third person perspective of the virtual object.
  • the viewing direction refers to a direction in which the virtual object is observed in a virtual environment in a third person perspective.
  • the viewing direction is a direction when the virtual object is observed by the camera model in the virtual environment.
  • Camera model is a model located around a virtual object in a virtual environment.
  • when the first-person perspective is adopted, the camera model is located near the head of the virtual object or at the head of the virtual object.
  • when the third-person perspective is adopted, the camera model can be located behind the virtual object and bound to the virtual object, or can be located at any position at a preset distance from the virtual object.
  • through the camera model, the virtual object located in the virtual environment can be observed from different angles. Optionally, when the third-person perspective is an over-the-shoulder perspective, the camera model is located behind the virtual object (for example, behind the head and shoulders of the virtual character).
  • the camera model automatically follows the virtual object in the virtual environment, that is, when the position of the virtual object in the virtual environment changes, the camera model follows the position of the virtual object in the virtual environment and changes simultaneously, and the camera The model is always within the preset distance of the virtual object in the virtual environment.
  • the camera model is not actually displayed in the virtual environment, that is, the camera model cannot be seen in the virtual environment displayed in the user interface.
  • the camera model is located at an arbitrary distance from the virtual object as an example.
  • one virtual object corresponds to one camera model, and the camera model can rotate with the virtual object as the rotation center. During rotation, the camera model not only changes its angle but is also offset in displacement, while the distance between the camera model and the rotation center remains unchanged; that is, the camera model rotates on the surface of a sphere whose center is the rotation center. The rotation center may be any point of the virtual object, such as the head or the torso, or any point around the virtual object; this is not limited in this embodiment of the present application.
  • the viewing direction of the camera model is the direction in which the perpendicular to the tangent plane of the spherical surface at the camera model's position points toward the virtual object.
  • the camera model can also observe the virtual object at a preset angle in different directions of the virtual object.
  • a point in the virtual object 11 is determined as a rotation center 12, and the camera model rotates around the rotation center 12.
  • optionally, the camera model is configured with an initial position, which is a position above and behind the virtual object (for example, a position behind the head). For example, the initial position is position 13.
  • the viewing angle of the camera model changes as the camera model rotates.
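The spherical rotation described above can be sketched as follows (a minimal illustration under assumed coordinates and names, not the patent's implementation): the camera position is computed on a sphere of fixed radius around the rotation center, and the viewing direction is always the unit vector pointing back at that center.

```python
import math

def camera_on_sphere(center, radius, yaw_deg, pitch_deg):
    """Return (position, look_direction) for a camera orbiting `center`.

    The camera sits on a sphere of fixed `radius`: yaw rotates it around
    the vertical axis, pitch raises or lowers it. The look direction is
    the unit vector from the camera toward the rotation center, so the
    distance to the center never changes while rotating.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    cx, cy, cz = center
    # Spherical coordinates with the y-axis pointing up.
    px = cx + radius * math.cos(pitch) * math.sin(yaw)
    py = cy + radius * math.sin(pitch)
    pz = cz + radius * math.cos(pitch) * math.cos(yaw)
    look = ((cx - px) / radius, (cy - py) / radius, (cz - pz) / radius)
    return (px, py, pz), look

pos, look = camera_on_sphere(center=(0.0, 1.6, 0.0), radius=3.0,
                             yaw_deg=180.0, pitch_deg=20.0)
# The camera stays exactly `radius` away from the center at any angle.
assert abs(math.dist(pos, (0.0, 1.6, 0.0)) - 3.0) < 1e-9
```

This matches the property stated above: rotation changes both the camera's angle and its displacement, while its distance to the rotation center stays constant.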
  • FIG. 2 is a structural block diagram of an electronic device provided by an exemplary embodiment of the present application.
  • the electronic device 200 includes an operating system 220 and an application 222.
  • Operating system 220 is the underlying software that provides application 222 with secure access to computer hardware.
  • Application 222 is an application that supports a virtual environment.
  • the application 222 is an application that supports a three-dimensional virtual environment.
  • the application 222 can be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a third-person shooting (TPS) game, a first-person shooting (FPS) game, a MOBA game, or a multiplayer gun-battle survival game.
  • the application 222 can be a stand-alone version of an application, such as a stand-alone version of a 3D game program.
  • FIG. 3 is a block diagram showing the structure of a computer system provided by an exemplary embodiment of the present application.
  • the computer system 300 includes a first device 320, a server 340, and a second device 360.
  • the first device 320 installs and runs an application that supports the virtual environment.
  • the application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, and a multiplayer gun battle survival game.
  • the first device 320 is a device used by the first user, and the first user uses the first device 320 to control the first virtual object located in the virtual environment to perform activities, including but not limited to: adjusting body posture, crawling, walking, running, At least one of riding, jumping, driving, picking, shooting, attacking, and throwing.
  • the first virtual object is a first virtual character, such as a simulated persona or an anime character.
  • the first device 320 is connected to the server 340 via a wireless network or a wired network.
  • the server 340 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center.
  • Server 340 is used to provide background services for applications that support a three-dimensional virtual environment.
  • the server 340 undertakes the main computing work, the first device 320 and the second device 360 undertake the secondary computing work; or the server 340 undertakes the secondary computing work, and the first device 320 and the second device 360 undertake the main computing work;
  • the server 340, the first device 320, and the second device 360 employ a distributed computing architecture for collaborative computing.
  • the second device 360 installs and runs an application that supports the virtual environment.
  • the application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, an FPS game, a MOBA game, and a multiplayer gun battle survival game.
  • the second device 360 is a device used by the second user, and the second user uses the second device 360 to control the second virtual object located in the virtual environment to perform activities, including but not limited to: adjusting body posture, crawling, walking, running, At least one of riding, jumping, driving, picking, shooting, attacking, and throwing.
  • the second virtual object is a second virtual character, such as a simulated character or an anime character.
  • first avatar and the second avatar are in the same virtual environment.
  • first avatar and the second avatar may belong to the same team, the same organization, have a friend relationship, or have temporary communication rights.
  • first avatar and the second avatar may also belong to different teams, different organizations, or two groups that are hostile.
  • the applications installed on the first device 320 and the second device 360 are the same, or the applications installed on the two devices are the same type of applications of different control system platforms.
  • the first device 320 can be generally referred to as one of the plurality of devices
  • the second device 360 can be generally referred to as one of the plurality of devices. This embodiment is only illustrated by the first device 320 and the second device 360.
  • the device types of the first device 320 and the second device 360 are the same or different, and the device types include at least one of a game console, a desktop computer, a smart phone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, and a laptop portable computer.
  • the following embodiments are illustrated by the device being a desktop computer.
  • the number of devices described above may be more or less.
  • the above devices may be only one, or the above devices are dozens or hundreds, or more.
  • the number of devices and the device type are not limited in the embodiment of the present application.
  • FIG. 4 shows a method for performing viewing angle adjustment in a virtual environment according to an exemplary embodiment of the present application.
  • the method may be applied to the terminal or the server.
  • the method is applied to the terminal as an example.
  • the method for adjusting the angle of view in the virtual environment includes:
  • Step 401 Display a first view picture, which is a picture when the virtual object is viewed in the first view direction in the virtual environment.
  • the first view image includes a virtual object having a first orientation
  • the first view image is further displayed with a view adjustment control.
  • an application is installed in the terminal, and the application is an application supporting the virtual environment.
  • the application may be a virtual reality application, a TPS game, an FPS game, an MOBA game, or the like.
  • the view screen is a screen when the virtual object is viewed in the first view direction in the virtual environment of the application.
  • the first viewing angle may be an initial viewing direction when viewing the virtual object in the virtual environment.
  • the first viewing angle direction is a viewing angle direction when the virtual object is viewed from the rear of the virtual object.
  • the viewing angle adjustment control is superimposed and displayed on the first perspective image, which means that the viewing angle adjustment control is located on the upper layer of the first perspective image in the user interface, and the first perspective image is in the lower layer of the viewing angle adjustment control.
  • when the first view picture changes in the user interface, the position and size of the viewing angle adjustment control in the user interface are not affected; and when the user changes the size or position of the viewing angle adjustment control in the settings interface, the display of the first view picture in the user interface is not affected.
  • a first view screen 52 is displayed in the user interface 51.
  • the first view picture includes a virtual object 53 having a first orientation; the first orientation is the direction in which the virtual object 53 faces in the virtual environment.
  • the angle of view adjustment control 54 is superimposed on the first angle view screen.
  • Step 402 Receive a drag instruction for the angle adjustment control.
  • when the terminal has a touch display screen, the drag command may be generated by the user touching and dragging the viewing angle adjustment control; when the terminal is a desktop computer or a laptop portable computer, the drag command may be generated by the user dragging the viewing angle adjustment control through an external input device, for example, clicking the viewing angle adjustment control with a mouse and dragging.
  • optionally, the viewing angle adjustment control is a control superimposed on the view picture. When the terminal is a mobile phone or a tablet with a touch display screen, the manner of dragging the viewing angle adjustment control includes at least one of the following:
  • the user touches the angle adjustment control on the touch display screen, and directly drags the angle adjustment control;
  • the user touches the angle adjustment control on the touch display screen with one finger, and slides on the touch display screen through another finger, thereby implementing dragging the angle adjustment control;
  • the user touches the angle adjustment control on the touch display screen, and by pressing the physical button on the terminal, the drag of the angle adjustment control is realized.
  • the embodiment of the present application does not limit the manner in which the viewing angle adjustment control is dragged.
  • the viewing angle adjustment control may be any one of a circle, a square, a diamond, an ellipse, or an irregular graphic.
  • optionally, the viewing angle adjustment control 54 shown in FIG. 5 may further display a pattern of an eye to indicate that the control is used to adjust the viewing angle.
  • the size of the view adjustment control can be fixed or customized according to the needs of the user.
  • the viewing angle adjustment control may be superimposed on the right side or the left side of the view picture, and on the upper side or the lower side of the view picture; the display position of the control is not limited in this embodiment of the present application.
  • the viewing angle adjustment control may partially overlap other controls in the user interface, or may be set so that it does not overlap any other control in the user interface.
  • Step 403 Adjust the direction of the first viewing angle according to the drag instruction to obtain a second viewing angle direction.
  • the first viewing direction and the second viewing direction are directions for viewing the virtual environment in the virtual environment.
  • the manner of adjusting the direction of the first viewing angle according to the dragging instruction includes any one of the following manners:
  • the first viewing angle is a direction in which the virtual environment is observed by a camera model in the virtual environment, and the camera model is adjusted according to the dragging instruction;
  • the first viewing direction is the viewing direction in which the environment picture of the virtual environment is collected; the environment picture of the virtual environment is adjusted according to the drag instruction, and the picture outside the user interface is moved into the user interface according to the drag instruction.
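Step 403's mapping from a drag to a new viewing direction might look like the following (a sketch only; the sensitivity constant and clamping range are assumptions, not values from the patent): the drag's horizontal component rotates the viewing direction left or right and the vertical component rotates it up or down, with the pitch clamped so the camera cannot flip over.

```python
def apply_drag(yaw_deg, pitch_deg, drag_dx, drag_dy,
               sensitivity=0.25, pitch_limit=80.0):
    """Turn a drag vector (in pixels) into a new viewing direction.

    Horizontal drag changes yaw; vertical drag changes pitch. Only the
    viewing direction is an input and an output -- the virtual object's
    orientation is untouched, matching the decoupling described above.
    """
    new_yaw = (yaw_deg + drag_dx * sensitivity) % 360.0
    new_pitch = pitch_deg - drag_dy * sensitivity
    new_pitch = max(-pitch_limit, min(pitch_limit, new_pitch))
    return new_yaw, new_pitch

# Dragging the control 120 px to the right and 40 px up:
yaw, pitch = apply_drag(yaw_deg=350.0, pitch_deg=0.0, drag_dx=120, drag_dy=-40)
assert yaw == 20.0 and pitch == 10.0
```

Wrapping yaw modulo 360 and clamping pitch are common conventions in third-person camera controls; either the camera model (first manner) or the collected environment picture (second manner) can then be updated from the resulting direction.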
  • Step 404 Display a second view picture, which is a picture when the virtual object is viewed in the second view direction in the virtual environment.
  • the second view image includes the virtual object having the first orientation.
  • the viewing angle adjustment control is also superimposed on the second perspective image.
  • the terminal receives the drag instruction for the viewing angle adjustment control, adjusts the first viewing angle direction to obtain a second viewing angle direction, and displays a second viewing angle picture 62 in the user interface 61. The second viewing angle picture 62 includes the virtual object 53 having the first orientation, the first orientation of the virtual object 53 remains unchanged, and the viewing angle adjustment control 54 is also superimposed on the second viewing angle picture 62.
  • when the viewing angle direction is adjusted, the viewing angle may be rotated in the virtual environment, or translated in the virtual environment.
  • in summary, the method for adjusting the viewing angle in the virtual environment provided by this embodiment provides a viewing angle adjustment control; when the user operates the viewing angle adjustment control on the first user interface, the viewing angle direction in the virtual environment can be adjusted without changing the orientation of the virtual object.
  • that is, while the virtual object keeps its original orientation and original motion posture, the user can observe other viewing angle directions of the virtual environment by operating the viewing angle adjustment control; the user does not change the direction in which the virtual object travels in the virtual environment merely by observing other viewing angle directions, thereby avoiding interference between the viewing angle adjustment operation and the motion track of the virtual object.
  • the first view direction is a direction when the virtual object is observed by the camera model in the virtual environment
  • FIG. 7 is a flowchart of a method for adjusting the viewing angle in a virtual environment provided by another exemplary embodiment of the present application; as shown in FIG. 7, the method includes:
  • Step 701 Display a first view picture, which is a picture when the virtual object is viewed in the first view direction in the virtual environment.
  • the first view image includes a virtual object having a first orientation
  • the first view image is further displayed with a view adjustment control.
  • the first viewing direction is a direction when the virtual object is observed by the camera model in the virtual environment.
  • the first viewing angle direction is a direction in which the camera model observes the virtual object from behind the virtual object in the virtual environment, that is, a direction pointing from behind the virtual object toward the virtual object.
  • Step 702 Receive a drag instruction for the angle adjustment control.
  • a locking operation may be performed on the first orientation of the virtual object, where the locking operation is used to control that the first orientation of the virtual object is not changed.
  • Step 703 determining a drag direction of the drag instruction.
  • the angle adjustment control can be dragged to any direction in the plane of the user interface.
  • two ways of determining the drag direction are illustrated in FIG. 8 and FIG. 9.
  • in FIG. 8, the leftward direction of the viewing angle adjustment control 81 is taken as 0°, and the angle increases clockwise up to 360°.
  • in FIG. 9, the leftward direction of the viewing angle adjustment control 91 is taken as 0°, the angle increases clockwise up to 180°, and decreases counterclockwise down to -180°.
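  • The angle conventions of FIG. 8 and FIG. 9 can be sketched in code. The following is an illustrative Python sketch only, not part of the claimed embodiment; the screen coordinate convention (+x right, +y up) and the function name are assumptions:

```python
import math

def drag_angle(dx: float, dy: float, signed: bool = False) -> float:
    # Clockwise angle of the drag vector measured from the leftward
    # direction (left = 0 deg), assuming +x points right and +y points up.
    # signed=False gives [0, 360) as in FIG. 8; signed=True gives
    # (-180, 180] as in FIG. 9.
    clockwise = math.degrees(math.atan2(dy, -dx)) % 360.0
    if signed and clockwise > 180.0:
        clockwise -= 360.0
    return clockwise
```

  • Under these conventions, a drag straight to the right yields 180° in both figures, while a drag straight down yields 270° in the FIG. 8 convention and -90° in the FIG. 9 convention.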
  • Step 704: according to the drag instruction, the camera model is rotated in the virtual environment with the virtual object as the rotation center and the direction corresponding to the drag direction as the rotation direction.
  • the dragging instruction is further used to indicate a dragging distance, and the dragging distance of the dragging instruction is positively correlated with a rotation angle of the camera model during the rotating process.
  • the rotation direction of the camera model corresponds to the drag direction
  • taking as an example the case where the camera model rotates on a spherical surface centered on the virtual object: according to the drag instruction, the camera model moves along the circle on the spherical surface that corresponds to the drag direction, rotating clockwise or counterclockwise.
  • FIG. 10 illustrates a specific example in which the camera model rotates according to the drag instruction.
  • the user interface 1010 includes a viewing angle adjustment control 1020 and a virtual object 1030 having a first orientation.
  • when the user drags the viewing angle adjustment control 1020 to the right side, for example toward the 180° direction, a horizontal circle corresponding to the 180° direction is formed on the spherical surface on which the camera model 1041 is located, and the camera model 1041 rotates clockwise along the circle according to the drag direction, the rotation angle being determined by the drag distance.
  • for example, when the drag distance is 1 unit, the camera model 1041 rotates by 15°; in FIG. 10, the camera model 1041 rotates to the position of the camera model 1042.
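  • The rotation described above, with the camera model orbiting the virtual object on a spherical surface and the drag distance mapping linearly to the rotation angle (1 unit → 15°), can be sketched as follows. This is an illustrative sketch only; the 15°-per-unit ratio is taken from the example, while the y-up axis convention and the function name are assumptions:

```python
import math

DEGREES_PER_UNIT = 15.0  # example ratio: 1 unit of drag distance -> 15 deg

def orbit_camera(center, radius, yaw_deg, pitch_deg, drag_units):
    # Rotate the camera model around `center` (the virtual object) on a
    # sphere of `radius`. A horizontal drag changes only the yaw, so the
    # camera moves along the horizontal circle of constant pitch; the
    # rotation angle is proportional to the drag distance.
    yaw_deg = (yaw_deg + drag_units * DEGREES_PER_UNIT) % 360.0
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    cx, cy, cz = center
    x = cx + radius * math.cos(pitch) * math.cos(yaw)
    y = cy + radius * math.sin(pitch)   # y is the up axis
    z = cz + radius * math.cos(pitch) * math.sin(yaw)
    return (x, y, z), yaw_deg
```

  • Each call leaves the camera at distance `radius` from the virtual object, so the first orientation of the object never needs to change for the viewing angle direction to move.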
  • Step 705 Determine a viewing angle direction of the rotated camera model to observe the virtual environment as a second viewing angle direction.
  • the direction in which the rotated camera model 1042 observes the virtual object 1030 is determined as the second viewing angle direction.
  • optionally, when it is detected that the camera model is obstructed during the rotation, the viewing angle direction at the moment the camera model is obstructed is determined as the second viewing angle direction.
  • the cases in which the camera model is obstructed during the rotation include at least one of the following:
  • first, the rotation is restricted by clipping detection: clipping detection refers to detecting, according to physical principles in the virtual environment, whether the virtual object could actually observe the rotated viewing angle direction.
  • when it is detected that the rotation of the camera model is restricted by the clipping detection, the viewing angle direction at the moment the clipping detection restricts the rotation is determined as the second viewing angle direction.
  • optionally, clipping detection also covers the case where the camera model is not blocked by any obstacle during the rotation but the rotated viewing angle is still implausible; for example, when the camera model rotates to a position outside a window and observes the inside of the room through the window, the rotation is considered to be restricted by the clipping detection.
  • second, the camera model is blocked by an obstacle, where the obstacle is a three-dimensional model in the virtual environment.
  • when the camera model is blocked by the obstacle during the rotation, the viewing angle direction at that moment is determined as the second viewing angle direction.
  • optionally, whether the camera model is blocked by an obstacle during the rotation can be detected by performing collision detection in the virtual environment.
  • for example, collision detection is performed along the rotation path of the camera model to determine whether the camera model collides with an obstacle during the rotation; when the camera model collides with an obstacle during the rotation, the camera model is considered to be blocked by the obstacle.
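  • The collision detection along the rotation path can be sketched as a sweep: the camera's candidate positions are stepped through in order, and each one is tested for a clear line of sight to the virtual object. This is an illustrative sketch only; real engines expose their own collision queries, and the axis-aligned-box obstacle, step size, and function names here are assumptions:

```python
import math

def segment_hits_box(p, q, box):
    # Slab test: does the segment p -> q intersect the axis-aligned box
    # (lo, hi)? A stand-in for the engine's collision query.
    lo, hi = box
    t0, t1 = 0.0, 1.0
    for a in range(3):
        d = q[a] - p[a]
        if abs(d) < 1e-12:
            if p[a] < lo[a] or p[a] > hi[a]:
                return False
        else:
            ta, tb = (lo[a] - p[a]) / d, (hi[a] - p[a]) / d
            if ta > tb:
                ta, tb = tb, ta
            t0, t1 = max(t0, ta), min(t1, tb)
            if t0 > t1:
                return False
    return True

def sweep_rotation(center, radius, start_deg, target_deg, boxes, step_deg=1.0):
    # March the camera's yaw from start toward target along its horizontal
    # circle; stop at the last angle whose camera position keeps a clear
    # line of sight to the object. The returned angle is the one kept as
    # the second viewing angle direction when an obstacle blocks the path.
    sign = 1.0 if target_deg >= start_deg else -1.0
    angle, last_clear = start_deg, start_deg
    while sign * (target_deg - angle) > 0:
        angle += sign * step_deg
        cam = (center[0] + radius * math.cos(math.radians(angle)),
               center[1],
               center[2] + radius * math.sin(math.radians(angle)))
        if any(segment_hits_box(center, cam, b) for b in boxes):
            return last_clear  # blocked: keep the last unobstructed direction
        last_clear = angle
    return target_deg
```

  • Sampling the path rather than testing only the end position is what catches obstacles that lie between the first and second camera positions.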
  • Step 706 Display a second view picture, which is a picture when the virtual object is viewed in the second view direction in the virtual environment.
  • the second view image includes the virtual object having the first orientation.
  • the viewing angle adjustment control is also superimposed on the second perspective image.
  • the second viewing angle picture observed by the rotated camera model 1042 includes the virtual object 1030 having the first orientation, and the viewing angle adjustment control 1020 is also superimposed on the second viewing angle picture.
  • the drag state may be a paused drag state, in which the viewing angle adjustment control 1020 is continuously touched but the drag is paused, or a continuous drag state.
  • optionally, the viewing angle adjustment control 1020 is panned and displayed within a preset panning range (illustrated by the dashed circle around the control); for example, when the drag direction is to the right, the viewing angle adjustment control pans to the right until it reaches the edge of the dashed circle.
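  • Panning the control within the dashed circle amounts to clamping the drag offset to a maximum radius. A minimal sketch, assuming pixel offsets and a hypothetical radius value:

```python
import math

MAX_PAN_RADIUS = 24.0  # assumed radius of the dashed circle, in pixels

def clamp_to_pan_range(dx, dy, max_radius=MAX_PAN_RADIUS):
    # The control follows the finger until the offset reaches the edge of
    # the preset panning range; beyond that it is projected back onto the
    # circle's edge.
    dist = math.hypot(dx, dy)
    if dist <= max_radius:
        return (dx, dy)
    scale = max_radius / dist
    return (dx * scale, dy * scale)
```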
  • in summary, the method for adjusting the viewing angle in the virtual environment provided by this embodiment provides a viewing angle adjustment control; when the user operates the viewing angle adjustment control on the first user interface, the viewing angle direction in the virtual environment can be adjusted without changing the orientation of the virtual object.
  • that is, while the virtual object keeps its original orientation and original motion posture, the user can observe other viewing angle directions of the virtual environment by operating the viewing angle adjustment control; the user does not change the direction in which the virtual object travels in the virtual environment merely by observing other viewing angle directions, thereby improving the convenience with which the user observes the virtual environment.
  • in the method provided by this embodiment, the virtual object in the virtual environment is observed through the camera model, and the orientation of the virtual object is decoupled from the direction in which the camera model observes the virtual environment; the user can rotate the camera model by operating the viewing angle adjustment control to observe other viewing angles of the virtual environment, which improves the convenience with which the user observes the virtual environment.
  • in another exemplary embodiment of the present application, the method for performing viewing angle adjustment in a virtual environment includes:
  • Step 1101 Display a first view picture, which is a picture when the virtual object is viewed in the first view direction in the virtual environment.
  • the first view image includes a virtual object having a first orientation
  • the first view image is further displayed with a view adjustment control.
  • the first viewing direction is a direction when the virtual object is observed by the camera model in the virtual environment.
  • the first viewing angle direction is a direction in which the camera model observes the virtual object from behind the virtual object in the virtual environment, that is, a direction pointing from behind the virtual object toward the virtual object.
  • Step 1102 Receive a drag instruction for the angle adjustment control.
  • a locking operation may be performed on the first orientation of the virtual object, where the locking operation is used to control that the first orientation of the virtual object is not changed.
  • Step 1103: determine the drag direction of the drag instruction.
  • optionally, the user may drag the viewing angle adjustment control in any direction in the plane of the user interface; the determination of the drag direction has been described in detail in step 703 and is not repeated here.
  • Step 1104: according to the drag instruction, the camera model is rotated in the virtual environment with the virtual object as the rotation center and the direction corresponding to the drag direction as the rotation direction.
  • the dragging instruction is further used to indicate a dragging distance, and the dragging distance of the dragging instruction is positively correlated with a rotation angle of the camera model during the rotating process.
  • the rotation direction of the camera model corresponds to the drag direction
  • taking as an example the case where the camera model rotates on a spherical surface centered on the virtual object: according to the drag instruction, the camera model moves along the circle on the spherical surface that corresponds to the drag direction, rotating clockwise or counterclockwise.
  • Step 1105 Determine a viewing angle direction of the rotated camera model to observe the virtual environment as a second viewing angle direction.
  • Step 1106 Display a second view picture, which is a picture when the virtual object is viewed in the second view direction in the virtual environment.
  • the second view image includes the virtual object having the first orientation.
  • the viewing angle adjustment control is also superimposed on the second perspective image.
  • Step 1107 detecting a drag instruction for the angle adjustment control.
  • the state of the drag instruction may include: the drag instruction continues, the drag instruction pauses, or the drag instruction stops.
  • a continuing drag instruction means that the user keeps dragging the viewing angle adjustment control; a paused drag instruction means that the user keeps touching (or pressing or clicking) the viewing angle adjustment control but pauses dragging it.
  • a stopped drag instruction means that the user stops touching (or pressing or clicking) the viewing angle adjustment control and also stops dragging it.
  • Step 1108: when it is detected that the drag instruction has stopped, the second viewing angle direction is automatically adjusted back to the first viewing angle direction along a preset trajectory.
  • optionally, the adjustment includes at least one of the following modes: acquiring the shortest rotation path from the second viewing angle direction to the first viewing angle direction, and rotating the second viewing angle direction to the first viewing angle direction along the shortest rotation path.
  • optionally, adjusting the second viewing angle direction to the first viewing angle direction means rotating the camera model from a second position to a first position.
  • the second position is the position of the camera model when it observes the virtual environment in the second viewing angle direction, and the first position is the position of the camera model when it observes the virtual environment in the first viewing angle direction.
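  • The shortest rotation path between the two viewing angle directions can be sketched, for the yaw component, as the signed angular difference wrapped into (-180°, 180°]. An illustrative sketch only; the function names and step size are assumptions:

```python
def shortest_delta(current_deg, home_deg):
    # Signed difference in (-180, 180]: the shortest arc that brings the
    # second viewing angle direction back to the first viewing angle
    # direction.
    delta = (home_deg - current_deg) % 360.0
    if delta > 180.0:
        delta -= 360.0
    return delta

def restore_steps(current_deg, home_deg, step_deg=5.0):
    # Intermediate yaw angles along the preset trajectory, ending exactly
    # at the first viewing angle direction.
    delta = shortest_delta(current_deg, home_deg)
    n = max(1, int(abs(delta) // step_deg))
    return [(current_deg + delta * i / n) % 360.0 for i in range(1, n + 1)]
```

  • Wrapping the difference before interpolating is what makes the camera rotate 20° across the 0°/360° seam instead of taking the long 340° way around.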
  • Step 1109: when the state of the virtual object is a standing state, a crouching state, or a prone state, the first viewing angle picture is displayed.
  • in these states, the position and orientation of the virtual object in the virtual environment do not change, so after the second viewing angle direction is adjusted back to the first viewing angle direction, the first viewing angle picture is displayed directly.
  • Step 1110: when the state of the virtual object is any one of continuous running, continuous walking, driving a vehicle, or riding in a vehicle, the updated first viewing angle picture is displayed.
  • the updated first viewing angle picture is the picture, updated by the movement of the virtual object in the virtual environment, that is displayed when the virtual object is observed in the first viewing angle direction; the updated first viewing angle picture includes the virtual object having the first orientation. Because the virtual object moves in the virtual environment, its location in the virtual environment may change, so the first viewing angle picture is updated as that location changes.
  • in these states, the orientation of the virtual object in the virtual environment does not change, but its position does; therefore, after the second viewing angle direction is adjusted back to the first viewing angle direction, the updated first viewing angle picture is displayed.
  • in summary, the method for adjusting the viewing angle in the virtual environment provided by this embodiment provides a viewing angle adjustment control; when the user operates the viewing angle adjustment control on the first user interface, the viewing angle direction in the virtual environment can be adjusted without changing the orientation of the virtual object.
  • that is, while the virtual object keeps its original orientation and original motion posture, the user can observe other viewing angle directions of the virtual environment by operating the viewing angle adjustment control; the user does not change the direction in which the virtual object travels in the virtual environment merely by observing other viewing angle directions, thereby improving the convenience with which the user observes the virtual environment.
  • in the method provided by this embodiment, the virtual object in the virtual environment is observed through the camera model, and the orientation of the virtual object is decoupled from the direction in which the camera model observes the virtual environment; the user can rotate the camera model by operating the viewing angle adjustment control to observe other viewing angles of the virtual environment, which improves the convenience with which the user observes the virtual environment.
  • in addition, when it is detected that the drag instruction has stopped, the second viewing angle direction is automatically adjusted back to the first viewing angle direction along the preset trajectory; the user can restore the viewing angle to the first viewing angle direction simply by stopping the drag or stopping touching the viewing angle adjustment control, which further improves the convenience with which the user observes the virtual environment.
  • FIG. 12 is a flowchart of a method for performing perspective adjustment in a virtual environment according to another exemplary embodiment of the present application, where the method for performing perspective adjustment in a virtual environment includes:
  • step 1201 a viewing angle adjustment control is generated.
  • the terminal determines whether the viewing angle adjustment control is enabled in the application; when it is enabled, the viewing angle adjustment control is generated and superimposed on the viewing angle picture.
  • step 1202 it is determined whether the angle adjustment control is pressed.
  • Step 1203 When the viewing angle adjustment control is pressed, the orientation of the virtual object is locked, and the drag direction is determined.
  • when the viewing angle adjustment control is pressed, it indicates that the user needs to observe other viewing angles of the virtual environment while the virtual object maintains its existing orientation; therefore, the orientation of the virtual object is locked, and the drag direction of the drag operation is determined.
  • step 1204 the camera model faces the virtual object and follows the virtual object.
  • the camera model always keeps the virtual object in view and follows the virtual object, i.e., when the virtual object is in motion, the camera model follows the virtual object and rotates around it.
  • step 1205 the camera model rotates in a direction corresponding to the drag direction.
  • step 1206 after the dragging of the viewing angle adjustment control is stopped, the viewing direction of the camera model is restored.
  • the viewing direction of the camera model is restored to the viewing direction before the viewing angle adjustment control is pressed.
  • Step 1207: when a meaningless viewing angle is encountered, the camera automatically adjusts so that reasonable content is displayed.
  • during the rotation of the camera model, it is continuously detected whether the viewing angle obtained by the rotation is a meaningless viewing angle, that is, a viewing angle that cannot be achieved according to physical principles.
  • when a meaningless viewing angle is detected, the camera model is controlled not to continue rotating toward the meaningless viewing angle.
  • optionally, the meaningless viewing angle may be a viewing angle at which the camera model rotates into a wall, a viewing angle at which the camera model rotates below the ground, or a viewing angle at which the camera model rotates outside a window to observe the room through the window, where, according to physical principles, the virtual object cannot observe the virtual environment inside the room from that location.
  • Step 1301 generating a viewing angle adjustment control.
  • step 1302 it is determined whether to drag the view area or drag the view adjustment control.
  • step 1303 when the viewing angle adjustment control is dragged, the camera model is independent of the orientation of the virtual object.
  • step 1304 the movement of the virtual object does not change.
  • step 1305 when the view area is dragged, the camera model is synchronized with the orientation of the virtual object, and the rear view angle is maintained.
  • when the viewing angle area is dragged, the observation direction of the camera model is synchronized with the orientation of the virtual object.
  • that is, when the viewing angle is adjusted, the virtual object turns as the camera model rotates, and the camera model maintains a rear view of the virtual object.
  • optionally, the viewing angle area may be the left half or the right half of the user interface, or may be a preset area;
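  • The branch in the flow above (dragging the viewing angle adjustment control versus dragging the viewing angle area) is essentially a hit test on the touch-down point. A minimal dispatch sketch; the hit radius and return labels are assumptions:

```python
CONTROL_HIT_RADIUS = 40.0  # assumed touch radius of the control, in pixels

def classify_drag(touch, control_center, hit_radius=CONTROL_HIT_RADIUS):
    # Touches that land on the viewing angle adjustment control decouple
    # the camera model from the virtual object's orientation; touches in
    # the viewing angle area keep them synchronized (rear view maintained).
    dx = touch[0] - control_center[0]
    dy = touch[1] - control_center[1]
    if dx * dx + dy * dy <= hit_radius * hit_radius:
        return "control"
    return "view_area"
```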
  • FIG. 14 is a structural block diagram of an apparatus for performing viewing angle adjustment in a virtual environment according to an exemplary embodiment of the present application.
  • the apparatus may be disposed in a terminal, and the apparatus for performing viewing angle adjustment in a virtual environment includes: a display module 1410, a receiving module 1420, and an adjusting module 1430;
  • the display module 1410 is configured to display a first viewing angle picture, where the first viewing angle picture is a picture displayed when the virtual object is observed in the first viewing angle direction in the virtual environment, the first viewing angle picture includes the virtual object having the first orientation, and a viewing angle adjustment control is superimposed on the first viewing angle picture;
  • the receiving module 1420 is configured to receive a drag instruction for the viewing angle adjustment control
  • the adjusting module 1430 is configured to adjust the direction of the first viewing angle according to the dragging instruction to obtain a second viewing angle direction;
  • the display module 1410 is further configured to display a second viewing angle picture, where the second viewing angle picture is a picture displayed when the virtual object is observed in the second viewing angle direction in the virtual environment, and the second viewing angle picture includes the virtual object having the first orientation.
  • the first viewing direction is a direction when the virtual object is observed by the camera model in the virtual environment
  • the adjustment module 1430 includes:
  • a determining unit 1431 configured to determine a drag direction of the drag instruction
  • the adjusting unit 1432 is configured to rotate the camera model in the virtual environment according to the drag instruction, with the virtual object as the rotation center and the direction corresponding to the drag direction as the rotation direction;
  • the determining unit 1431 is further configured to determine a direction in which the rotated camera model observes the virtual object as a second viewing angle direction.
  • the determining unit 1431 is further configured to determine a viewing angle direction when the camera model is blocked as the second viewing angle direction when detecting that the camera model has an obstacle during the rotating process.
  • optionally, the viewing angle direction when the camera model is obstructed is the viewing angle direction when the rotation is restricted by clipping detection, where clipping detection refers to detecting, according to physical principles in the virtual environment, whether the virtual object could actually observe the rotated viewing angle direction.
  • the determining unit 1431 is further configured to determine, when detecting that the rotation of the camera model is restricted by the clipping detection, the viewing angle direction at the moment the clipping detection restricts the rotation as the second viewing angle direction.
  • optionally, the viewing angle direction when the camera model is obstructed is the viewing angle direction when the camera model is blocked by an obstacle.
  • the determining unit 1431 is further configured to determine, when detecting that the camera model is blocked by an obstacle during the rotation, the viewing angle direction at the moment the camera model is blocked by the obstacle as the second viewing angle direction.
  • the device further includes:
  • the locking module 1440 is configured to perform a locking operation on the first orientation of the virtual object, and the locking operation is used to control that the first orientation of the virtual object is not changed.
  • the device further includes:
  • the detecting module 1450 is configured to detect a drag instruction for the viewing angle adjustment control
  • the adjusting module 1430 is further configured to automatically adjust the second viewing angle direction to the first viewing angle direction along the preset trajectory when detecting that the dragging instruction is stopped;
  • the display module 1410 is further configured to display the first view picture when the state of the virtual object is a standing state or a kneeling state or a kneeling state.
  • the device further includes:
  • a detecting module 1450 configured to detect a control signal for the viewing angle adjustment control
  • the adjusting module 1430 is further configured to adjust the second viewing angle direction to the first viewing angle direction along the preset trajectory when detecting that the control signal is stopped;
  • the display module 1410 is further configured to display the updated first viewing angle picture when the state of the virtual object is any one of continuous running, continuous walking, driving a vehicle, or riding in a vehicle, where the updated first viewing angle picture is the picture, updated by the movement of the virtual object in the virtual environment, that is displayed when the virtual object is observed in the first viewing angle direction, and the updated first viewing angle picture includes the virtual object having the first orientation.
  • the display module 1410 is further configured to display a third viewing angle picture, where the third viewing angle picture is a picture of the virtual object observed during the process of changing the first viewing angle direction to the second viewing angle direction.
  • optionally, the viewing angle adjustment control superimposed on the third viewing angle picture is panned and displayed in the drag direction within the preset panning range.
  • FIG. 16 is a block diagram showing the structure of a terminal 1600 provided by an exemplary embodiment of the present invention.
  • the terminal 1600 can be: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a laptop, or a desktop computer.
  • Terminal 1600 may also be referred to as a user device, a portable terminal, a laptop terminal, a desktop terminal, and the like.
  • the terminal 1600 includes a processor 1601 and a memory 1602.
  • the processor 1601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like.
  • the processor 1601 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array).
  • the processor 1601 may also include a main processor and a coprocessor; the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state.
  • the processor 1601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display.
  • the processor 1601 may further include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
  • Memory 1602 can include one or more computer readable storage media, which can be non-transitory.
  • the memory 1602 can also include high-speed random access memory, as well as non-volatile memory such as one or more magnetic disk storage devices or flash memory storage devices.
  • the non-transitory computer readable storage medium in the memory 1602 is used to store at least one instruction, the at least one instruction being executed by the processor 1601 to implement the method for adjusting the viewing angle in a virtual environment provided by the method embodiments of the present application.
  • the terminal 1600 optionally further includes: a peripheral device interface 1603 and at least one peripheral device.
  • the processor 1601, the memory 1602, and the peripheral device interface 1603 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1603 via a bus, signal line or circuit board.
  • the peripheral device includes at least one of a radio frequency circuit 1604, a touch display screen 1605, a camera 1606, an audio circuit 1607, a positioning component 1608, and a power source 1609.
  • Peripheral device interface 1603 can be used to connect at least one peripheral device associated with I/O (Input/Output) to processor 1601 and memory 1602.
  • in some embodiments, the processor 1601, the memory 1602, and the peripheral device interface 1603 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1601, the memory 1602, and the peripheral device interface 1603 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the RF circuit 1604 is configured to receive and transmit an RF (Radio Frequency) signal, also referred to as an electromagnetic signal.
  • Radio frequency circuit 1604 communicates with the communication network and other communication devices via electromagnetic signals.
  • the RF circuit 1604 converts the electrical signal into an electromagnetic signal for transmission, or converts the received electromagnetic signal into an electrical signal.
  • the radio frequency circuit 1604 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like.
  • the radio frequency circuit 1604 can communicate with other terminals via at least one wireless communication protocol.
  • the wireless communication protocols include, but are not limited to, the World Wide Web, a metropolitan area network, an intranet, generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks.
  • the RF circuit 1604 may also include NFC (Near Field Communication) related circuitry, which is not limited in this application.
  • the display 1605 is used to display a UI (User Interface).
  • the UI can include graphics, text, icons, video, and any combination thereof.
  • the display 1605 also has the ability to capture touch signals on or above the surface of the display 1605.
  • the touch signal can be input to the processor 1601 as a control signal for processing.
  • the display 1605 can also be used to provide virtual buttons and/or virtual keyboards, also referred to as soft buttons and/or soft keyboards.
  • in some embodiments, there may be one display screen 1605, disposed on the front panel of the terminal 1600; in other embodiments, there may be at least two display screens 1605, disposed on different surfaces of the terminal 1600 or in a folded design; in still other embodiments, the display screen 1605 may be a flexible display screen disposed on a curved or folded surface of the terminal 1600. The display screen 1605 may even be set to a non-rectangular irregular shape, that is, a shaped screen.
  • the display 1605 can be made of a material such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
  • Camera component 1606 is used to capture images or video.
  • camera assembly 1606 includes a front camera and a rear camera.
  • the front camera is placed on the front panel of the terminal, and the rear camera is placed on the back of the terminal.
  • in some embodiments, there are at least two rear cameras, each being one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blur function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting, VR (Virtual Reality) shooting, or other fused shooting functions.
  • camera assembly 1606 can also include a flash.
  • the flash can be a single-color-temperature flash or a dual-color-temperature flash.
  • a dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.
  • the audio circuit 1607 can include a microphone and a speaker.
  • the microphone is used to collect sound waves of the user and the environment, and convert the sound waves into electrical signals to be input to the processor 1601 for processing, or to be input to the RF circuit 1604 for voice communication.
  • for stereo capture or noise reduction, there may be multiple microphones, respectively disposed at different parts of the terminal 1600.
  • the microphone can also be an array microphone or an omnidirectional acquisition microphone.
  • the speaker is then used to convert electrical signals from the processor 1601 or the RF circuit 1604 into sound waves.
  • the speaker can be a conventional film speaker or a piezoelectric ceramic speaker.
  • the audio circuit 1607 can also include a headphone jack.
  • the location component 1608 is used to locate the current geographic location of the terminal 1600 to implement navigation or LBS (Location Based Service).
  • the positioning component 1608 can be a positioning component based on the GPS (Global Positioning System) of the United States, the Beidou system of China, or the Galileo system.
  • a power supply 1609 is used to power various components in the terminal 1600.
  • the power source 1609 may use alternating current, direct current, a disposable battery, or a rechargeable battery.
  • the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery.
  • a wired rechargeable battery is a battery that is charged by a wired line
  • a wireless rechargeable battery is a battery that is charged by a wireless coil.
  • the rechargeable battery can also be used to support fast charging technology.
  • terminal 1600 also includes one or more sensors 1610.
  • the one or more sensors 1610 include, but are not limited to, an acceleration sensor 1611, a gyro sensor 1612, a pressure sensor 1613, a fingerprint sensor 1614, an optical sensor 1615, and a proximity sensor 1616.
  • the acceleration sensor 1611 can detect the magnitude of the acceleration on the three coordinate axes of the coordinate system established by the terminal 1600.
  • the acceleration sensor 1611 can be used to detect components of gravity acceleration on three coordinate axes.
  • the processor 1601 can control the touch display screen 1605 to display the user interface in a landscape view or a portrait view according to the gravity acceleration signal collected by the acceleration sensor 1611.
  • the acceleration sensor 1611 can also be used for the acquisition of game or user motion data.
  • the gyro sensor 1612 can detect the body direction and rotation angle of the terminal 1600, and the gyro sensor 1612 can cooperate with the acceleration sensor 1611 to collect the user's 3D motion on the terminal 1600. Based on the data collected by the gyro sensor 1612, the processor 1601 can implement functions such as motion sensing (such as changing the UI according to the user's tilting operation), image stabilization at the time of shooting, game control, and inertial navigation.
  • the pressure sensor 1613 can be disposed on a side border of the terminal 1600 and/or a lower layer of the touch display screen 1605.
  • when the pressure sensor 1613 is disposed at the side frame of the terminal 1600, a holding signal of the user on the terminal 1600 can be detected, and the processor 1601 performs left/right-hand recognition or a shortcut operation according to the holding signal collected by the pressure sensor 1613.
  • when the pressure sensor 1613 is disposed at the lower layer of the touch display screen 1605, the processor 1601 controls an operability control on the UI according to the user's pressure operation on the touch display screen 1605.
  • the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
  • the fingerprint sensor 1614 is configured to collect a user's fingerprint, and the processor 1601 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1614, or the fingerprint sensor 1614 identifies the user's identity according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the processor 1601 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1614 can be placed on the front, back, or side of the terminal 1600. When a physical button or a vendor logo is provided on the terminal 1600, the fingerprint sensor 1614 can be integrated with the physical button or the vendor logo.
  • Optical sensor 1615 is used to collect ambient light intensity.
  • the processor 1601 can control the display brightness of the touch display 1605 based on the ambient light intensity acquired by the optical sensor 1615. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1605 is raised; when the ambient light intensity is low, the display brightness of the touch display screen 1605 is lowered.
  • the processor 1601 can also dynamically adjust the shooting parameters of the camera assembly 1606 based on the ambient light intensity acquired by the optical sensor 1615.
  • the proximity sensor 1616, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 1600. The proximity sensor 1616 is used to collect the distance between the user and the front of the terminal 1600. In one embodiment, when the proximity sensor 1616 detects that the distance between the user and the front of the terminal 1600 gradually decreases, the processor 1601 controls the touch display 1605 to switch from the bright-screen state to the off-screen state; when the proximity sensor 1616 detects that the distance between the user and the front of the terminal 1600 gradually increases, the processor 1601 controls the touch display 1605 to switch from the off-screen state to the bright-screen state.
  • a person skilled in the art can understand that the structure shown in FIG. 16 does not constitute a limitation on the terminal 1600, which may include more or fewer components than illustrated, combine certain components, or employ a different component arrangement.
  • the computer readable storage medium may be the computer readable storage medium included in the memory in the foregoing embodiments, or may be a computer readable storage medium that exists separately and is not assembled into the terminal.
  • the computer readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for performing viewing angle adjustment in a virtual environment described in any one of FIG. 1 to FIG. 10.
  • the computer readable storage medium may include a Read Only Memory (ROM), a Random Access Memory (RAM), a Solid State Drive (SSD), or an optical disc.
  • the random access memory may include a Resistance Random Access Memory (ReRAM) and a Dynamic Random Access Memory (DRAM).
  • a person skilled in the art can understand that all or part of the steps of implementing the above embodiments may be completed by hardware, or by a program instructing related hardware; the program may be stored in a computer readable storage medium.
  • the storage medium mentioned may be a read only memory, a magnetic disk or an optical disk or the like.


Abstract

A method for adjusting a viewing angle in a virtual environment, including: displaying a first viewing-angle picture (52), the first viewing-angle picture (52) including a virtual object (53) having a first orientation; receiving a drag instruction for a viewing-angle adjustment control; adjusting a first viewing-angle direction according to the drag instruction to obtain a second viewing-angle direction; and displaying a second viewing-angle picture (62), the second viewing-angle picture (62) including the virtual object (53) having the first orientation. An apparatus for adjusting a viewing angle in a virtual environment, a terminal, and a computer readable storage medium are also provided. By providing the viewing-angle adjustment control, when the user operates the viewing-angle adjustment control on the first user interface, the viewing-angle direction for observing the virtual environment can be adjusted without changing the orientation of the virtual object; that is, while the virtual object keeps its original orientation, the user can observe the virtual environment in other viewing-angle directions by controlling the viewing-angle adjustment control, avoiding mutual interference between the viewing-angle adjustment operation and the motion trajectory of the virtual object.

Description

在虚拟环境中进行视角调整的方法、装置及可读存储介质
本申请要求于2018年04月16日提交的申请号为201810339749.7、发明名称为“在虚拟环境中进行视角调整的方法、装置及存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及计算机图形处理领域,特别涉及一种在虚拟环境中进行视角调整的方法、装置及可读存储介质。
背景技术
在具有虚拟环境的应用程序,如:虚拟现实应用程序、第三人称射击游戏(Third-Personal Shooting Game,TPS)、第一人称射击游戏(First-person shooting game,FPS)、多人在线战术竞技游戏(Multiplayer Online Battle Arena Games,MOBA)中,虚拟对象、虚拟物体、地面等显示元素是使用立体模型来实现效果的。用户可以通过拖动操作对虚拟对象在虚拟环境中的视角进行旋转。
通常,在上述应用程序的用户界面中,用户可以通过左手控制摇杆控件以及在右侧屏幕进行滑动对虚拟对象进行控制,其中,摇杆控件用于控制虚拟对象在朝向不变的情况下进行前进、后退、左移或者右移等操作,而在右侧屏幕用于对虚拟对象在虚拟环境中面朝的方向(也即视角方向)进行旋转。
然而,通过上述方式对虚拟对象的视角进行旋转时,同时也改变了虚拟对象的状态,即在旋转视角的过程中虚拟对象的模型也需要在虚拟环境中同时进行转动,也即虚拟对象的朝向也进行旋转,则虚拟对象在行走或者奔跑状态下时,旋转视角会改变虚拟对象行走或者奔跑的方向,造成用户对虚拟环境的观察时操作较为不便。
发明内容
本申请实施例提供了一种在虚拟环境中进行视角调整的方法、装置及可读存储介质,可以解决调整视角会改变虚拟对象行走或者奔跑的方向,各操作之间相互产生影响的问题。所述技术方案如下:
一方面,提供了一种在虚拟环境中进行视角调整的方法,应用于终端中,该方法包括:
显示第一视角画面,第一视角画面是在虚拟环境中以第一视角方向观察虚拟对象时的画面,第一视角画面中包括具有第一朝向的虚拟对象,第一视角画面上还叠加显示有视角调整控件;
接收对视角调整控件的拖动指令;
根据拖动指令对第一视角方向进行调整,得到第二视角方向;
显示第二视角画面,第二视角画面是在虚拟环境中以第二视角方向观察虚拟对象时的画面,第二视角画面中包括具有第一朝向的虚拟对象。
另一方面,提供了一种在虚拟环境中进行视角调整的装置,该装置包括:
显示模块,用于显示第一视角画面,第一视角画面是在虚拟环境中以第一视角方向观察虚拟对象时的画面,第一视角画面中包括具有第一朝向的虚拟对象,第一视角画面上还叠加显示有视角调整控件;
接收模块,用于接收对视角调整控件的拖动指令;
调整模块,用于根据拖动指令对第一视角方向进行调整,得到第二视角方向;
显示模块,还用于显示第二视角画面,第二视角画面是在虚拟环境中以第二视角方向观察虚拟对象时的画面,第二视角画面中包括具有第一朝向的虚拟对象。
另一方面,提供了终端,所述终端包括处理器和存储器,所述存储器中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现如上述本申请实施例所述的在虚拟环境中进行视角调整的方法。
另一方面,提供了一种计算机可读存储介质,所述计算机可读存储介质中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现如上述本申请实施例所述的在虚拟环境中进行视角调整的方法。
另一方面,提供了一种计算机程序产品,当所述计算机程序产品在计算机上运行时,使得计算机执行如上述本申请实施例所述的在虚拟环境中进行视角调整的方法。
本申请实施例提供的技术方案带来的有益效果至少包括:
通过设置视角调整控件,当用户在第一用户界面上对视角调整控件进行控制时,可以达到对观察虚拟环境的视角方向进行调整而不改变虚拟对象的朝向的效果,即在虚拟对象维持原有的人物朝向,保持原有运动姿态的同时,用户可以通过控制视角调整控件对虚拟环境的其他视角方向进行观察,也即,用户不会因为对其他视角方向进行观察而改变虚拟对象在虚拟环境中的行进方向,提高了用户对虚拟环境进行观察时的便捷程度,避免了视角调整操作与虚拟对象的运动轨迹之间相互影响。
附图说明
图1是本申请一个示例性实施例提供的摄像机模型的示意图;
图2是本申请一个示例性实施例提供的电子设备的结构框图;
图3是本申请一个示例性实施例提供的计算机系统的结构框图;
图4是本申请一个示例性实施例提供的在虚拟环境中进行视角调整的方法流程图;
图5是基于图4示出的实施例提供的用户界面的示意图;
图6是基于图4示出的实施例提供的视角调整过程中的用户界面的示意图;
图7是本申请另一个示例性实施例提供的在虚拟环境中进行视角调整的方法流程图;
图8是基于图7示出的实施例提供的拖动方向的示意图;
图9是基于图7示出的实施例提供的另一个拖动方向的示意图;
图10是基于图7示出的实施例提供的视角画面与拖动方向的对应关系示意图；
图11是本申请另一个示例性实施例提供的在虚拟环境中进行视角调整的方法流程图;
图12是本申请另一个示例性实施例提供的在虚拟环境中进行视角调整的方法流程图;
图13是本申请另一个示例性实施例提供的在虚拟环境中进行视角调整的方法流程图;
图14是本申请一个示例性实施例提供的在虚拟环境中进行视角调整的装置的结构框图;
图15是本申请另一个示例性实施例提供的在虚拟环境中进行视角调整的装置的结构框图;
图16是本申请一个示例性实施例提供的终端的结构框图。
具体实施方式
为使本申请的目的、技术方案和优点更加清楚,下面将结合附图对本申请实施方式作进一步地详细描述。
首先,对本申请实施例涉及的名词进行简单介绍:
虚拟环境:是指应用程序在终端上运行时显示(或提供)的虚拟环境。该虚拟环境可以是对真实世界的仿真环境,也可以是半仿真半虚构的三维环境,还可以是纯虚构的三维环境。虚拟环境可以是二维虚拟环境、2.5维虚拟环境和三维虚拟环境中的任意一种,下述实施例以虚拟环境是三维虚拟环境来举例说明,但对此不加以限定。可选地,该虚拟环境还用于进行至少两个虚拟角色之间的虚拟环境对战。
虚拟对象:是指在虚拟环境中的可活动对象。该可活动对象包括虚拟人物、虚拟动物、动漫人物中的至少一种。可选地,当虚拟环境为三维虚拟环境时,虚拟对象是基于动画骨骼技术创建的三维立体模型。每个虚拟对象在三维虚拟环境中具有自身的形状和体积,占据三维虚拟环境中的一部分空间。
视角方向:是指以虚拟对象的第一人称视角或第三人称视角在虚拟环境中进行观察时的观察方向。可选地,本申请实施例中,该视角方向是指在虚拟环境中以第三人称视角对虚拟对象进行观察的方向。可选地,该视角方向是在虚拟环境中通过摄像机模型对虚拟对象进行观察时的方向。
摄像机模型：是在虚拟环境中位于虚拟对象周围的模型，当采用第一人称视角时，该摄像机模型位于虚拟对象的头部附近或者位于虚拟对象的头部，当采用第三人称视角时，该摄像机模型可以位于虚拟对象的后方并与虚拟对象进行绑定，也可以位于与虚拟对象相距预设距离的任意位置，通过该摄像机模型可以从不同角度对位于虚拟环境中的虚拟对象进行观察，可选地，该第三人称视角为第一人称的过肩视角时，摄像机模型位于虚拟对象（比如虚拟人物的头肩部）的后方。可选地，摄像机模型在虚拟环境中对虚拟对象进行自动跟随，即，当虚拟对象在虚拟环境中的位置发生改变时，摄像机模型跟随虚拟对象在虚拟环境中的位置同时发生改变，且该摄像机模型在虚拟环境中始终处于虚拟对象的预设距离范围内。可选地，该摄像机模型在虚拟环境中不会进行实际显示，即，在用户界面显示的虚拟环境中无法识别到该摄像机模型。
以该摄像机模型位于与虚拟对象相距预设距离的任意位置为例进行说明，可选地，一个虚拟对象对应一个摄像机模型，该摄像机模型可以以虚拟对象为旋转中心进行旋转，如：以虚拟对象的任意一点为旋转中心对摄像机模型进行旋转，摄像机模型在旋转过程中不仅在角度上有转动，还在位移上有偏移，旋转时摄像机模型与该旋转中心之间的距离保持不变，即，将摄像机模型在以该旋转中心作为球心的球体表面进行旋转，其中，虚拟对象的任意一点可以是虚拟对象的头部、躯干、或者虚拟对象周围的任意一点，本申请实施例对此不加以限定。可选地，摄像机模型在对虚拟对象进行观察时，该摄像机模型的视角方向为该摄像机模型所在球面的切面上的垂线指向虚拟对象的方向。
可选地,该摄像机模型还可以在虚拟对象的不同方向以预设的角度对虚拟对象进行观察。
示意性的,请参考图1,在虚拟对象11中确定一点作为旋转中心12,摄像机模型围绕该旋转中心12进行旋转,可选地,该摄像机模型配置有一个初始位置,该初始位置为虚拟对象后上方的位置(比如脑部的后方位置)。示意性的,如图1所示,该初始位置为位置13,当摄像机模型旋转至位置14或者位置15时,摄像机模型的视角方向随摄像机模型的转动而进行改变。
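As an illustrative sketch only (not part of the application), the orbiting behaviour described above can be expressed in code: the camera model stays on a sphere of fixed radius around the rotation center, and its viewing-angle direction always points back at the virtual object. The function names, the spherical-coordinate convention, and the y-up axis choice below are all assumptions.

```python
import math

def camera_position(center, radius, yaw_deg, pitch_deg):
    """Position of the camera model on a sphere of fixed radius around the
    rotation center, for a given horizontal (yaw) and vertical (pitch) angle."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (center[0] + radius * math.cos(pitch) * math.sin(yaw),
            center[1] + radius * math.sin(pitch),
            center[2] + radius * math.cos(pitch) * math.cos(yaw))

def view_direction(center, cam_pos):
    """Viewing-angle direction: the unit vector from the camera model toward
    the rotation center on the virtual object."""
    d = [c - p for c, p in zip(center, cam_pos)]
    n = math.sqrt(sum(v * v for v in d))
    return tuple(v / n for v in d)
```

Because the camera is constrained to the sphere, rotating it changes both its angle and its position, matching the description that the rotation involves an angular turn as well as a displacement.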
图2示出了本申请一个示例性实施例提供的电子设备的结构框图。该电子设备200包括:操作系统220和应用程序222。
操作系统220是为应用程序222提供对计算机硬件的安全访问的基础软件。
应用程序222是支持虚拟环境的应用程序。可选地,应用程序222是支持三维虚拟环境的应用程序。该应用程序222可以是虚拟现实应用程序、三维地图程序、军事仿真程序、第三人称射击游戏(Third-Personal Shooting Game,TPS)、第一人称射击游戏(First-person shooting game,FPS)、MOBA游戏、多人枪战类生存游戏中的任意一种。该应用程序222可以是单机版的应用程序,比如单机版的3D游戏程序。
图3示出了本申请一个示例性实施例提供的计算机系统的结构框图。该计算机系统300包括：第一设备320、服务器340和第二设备360。
第一设备320安装和运行有支持虚拟环境的应用程序。该应用程序可以是虚拟现实应用程序、三维地图程序、军事仿真程序、TPS游戏、FPS游戏、MOBA游戏、多人枪战类生存游戏中的任意一种。第一设备320是第一用户使用的设备,第一用户使用第一设备320控制位于虚拟环境中的第一虚拟对象进行活动,该活动包括但不限于:调整身体姿态、爬行、步行、奔跑、骑行、跳跃、驾驶、拾取、射击、攻击、投掷中的至少一种。示意性的,第一虚拟对象是第一虚拟人物,比如仿真人物角色或动漫人物角色。
第一设备320通过无线网络或有线网络与服务器340相连。
服务器340包括一台服务器、多台服务器、云计算平台和虚拟化中心中的至少一种。服务器340用于为支持三维虚拟环境的应用程序提供后台服务。可选地,服务器340承担主要计算工作,第一设备320和第二设备360承担次要计算工作;或者,服务器340承担次要计算工作,第一设备320和第二设备360承担主要计算工作;或者,服务器340、第一设备320和第二设备360三者之间采用分布式计算架构进行协同计算。
第二设备360安装和运行有支持虚拟环境的应用程序。该应用程序可以是虚拟现实应用程序、三维地图程序、军事仿真程序、FPS游戏、MOBA游戏、多人枪战类生存游戏中的任意一种。第二设备360是第二用户使用的设备,第二用户使用第二设备360控制位于虚拟环境中的第二虚拟对象进行活动,该活动包括但不限于:调整身体姿态、爬行、步行、奔跑、骑行、跳跃、驾驶、拾取、射击、攻击、投掷中的至少一种。示意性的,第二虚拟对象是第二虚拟人物,比如仿真人物角色或动漫人物角色。
可选地,第一虚拟人物和第二虚拟人物处于同一虚拟环境中。可选地,第一虚拟人物和第二虚拟人物可以属于同一个队伍、同一个组织、具有好友关系或具有临时性的通讯权限。可选地,第一虚拟人物和第二虚拟人物也可以属于不同队伍、不同组织、或具有敌对性的两个团体。
可选地，第一设备320和第二设备360上安装的应用程序是相同的，或两个设备上安装的应用程序是不同控制系统平台的同一类型应用程序。第一设备320可以泛指多个设备中的一个，第二设备360可以泛指多个设备中的一个，本实施例仅以第一设备320和第二设备360来举例说明。第一设备320和第二设备360的设备类型相同或不同，该设备类型包括：游戏主机、台式计算机、智能手机、平板电脑、电子书阅读器、MP3播放器、MP4播放器和膝上型便携计算机中的至少一种。以下实施例以设备是台式计算机来举例说明。
本领域技术人员可以知晓,上述设备的数量可以更多或更少。比如上述设备可以仅为一个,或者上述设备为几十个或几百个,或者更多数量。本申请实施例对设备的数量和设备类型不加以限定。
结合上述名词解释,对本申请实施例提供的在虚拟环境中进行视角调整的方法进行说明,请参考图4,图4是本申请一个示例性的实施例提供的在虚拟环境中进行视角调整的方法流程图,该方法可应用于终端中或者服务器中,本实施例中以该方法应用于终端中为例进行说明,该在虚拟环境中进行视角调整的方法包括:
步骤401,显示第一视角画面,该第一视角画面是在虚拟环境中以第一视角方向观察虚拟对象时的画面。
可选地,该第一视角画面中包括具有第一朝向的虚拟对象,该第一视角画面上还叠加显示有视角调整控件。
可选地,终端中安装有应用程序,该应用程序为支持上述虚拟环境的应用程序,可选地,该应用程序可以是虚拟现实应用程序、TPS游戏、FPS游戏、MOBA游戏等,该第一视角画面为在该应用程序的虚拟环境中以第一视角方向观察虚拟对象时的画面。
可选地,该第一视角方向可以是在该虚拟环境中观察虚拟对象时的初始视角方向。可选地,该第一视角方向是从虚拟对象的后方对虚拟对象进行观察时的视角方向。
可选地,视角调整控件叠加显示在第一视角画面之上,是指视角调整控件在用户界面中位于第一视角画面的上层,第一视角画面在视角调整控件的下层。当第一视角画面在用户界面中发生变化时,不影响视角调整控件在用户界面中的位置或者大小,当用户在设置界面中对视角调整控件的大小或者位置进行改变后,也不影响第一视角画面在用户界面中的显示。
示意性的,请参考图5,在用户界面51中显示第一视角画面52,该第一视角画面中包括具有第一朝向的虚拟对象53,该虚拟对象53的第一朝向为朝向该虚拟环境中的门口55,该第一视角画面上还叠加显示有视角调整控件54。
步骤402,接收对视角调整控件的拖动指令。
可选地,当该终端为带有触摸显示屏的手机或平板时,该拖动指令可以是用户通过对该视角调整控件进行触摸并拖动而产生的;当该终端为台式电脑、便携式膝上电脑时,该拖动操作可以是用户通过外部输入设备对该视角调整控件进行拖动产生的,如:用户通过鼠标点击该视角调整控件并进行拖动。
可选地,该视角调整控件为叠加显示在视角画面之上的控件,以该终端为带有触摸显示屏的手机或者平板为例进行说明,对该视角调整控件进行拖动的方式包括如下方式中的至少一种:
第一,用户在触摸显示屏上对该视角调整控件进行触摸,并对该视角调整控件直接进行拖动;
第二,用户通过一个手指在触摸显示屏上对视角调整控件进行触摸,并通过另一个手指在触摸显示屏上进行滑动,从而实现对视角调整控件进行拖动;
第三,用户通过在触摸显示屏上对视角调整控件进行触摸,并通过按下终端上的物理按键,实现对视角调整控件的拖动。
本申请实施例对视角调整控件的拖动方式不做限定。
可选地,该视角调整控件可以是圆形、方形、菱形、椭圆形、或者不规则图形中的任意一种,该视角调整控件还可以如图5中显示的视角调整控件54,显示有小眼睛的图案,以表示该控件用于对视角进行调整。该视角调整控件的大小可以是固定的,也可以根据用户的需要进行自定义设置。
可选地,该视角调整控件可以叠加显示在视角画面的右侧或者视角画面的左侧,可以叠加显示在视角画面的上侧也可以叠加显示在视角画面的下侧,本申请实施例对此不加以限定。
可选地,该视角调整控件在用户界面中可以与其他视角控件有部分重叠,也可以设置该视角调整控件在用户界面中与其他控件不能存在重叠部分。
步骤403,根据拖动指令对第一视角方向进行调整,得到第二视角方向。
可选地,第一视角方向以及第二视角方向都是用于在虚拟环境中对虚拟环境进行观察的方向。
可选地,根据拖动指令对第一视角方向进行调整的方式包括如下方式中的任意一种:
第一,该第一视角方向是通过虚拟环境中的摄像头模型对虚拟环境进行观察的方向,则根据拖动指令对该摄像头模型进行调整;
第二，该第一视角方向是对虚拟环境的环境画面进行采集时对应的视角方向，则根据拖动指令对虚拟环境的环境画面进行调整，根据拖动指令将用户界面外的画面调整至用户界面内。
步骤404,显示第二视角画面,该第二视角画面是在虚拟环境中以第二视角方向观察虚拟对象时的画面。
可选地,该第二视角画面中包括上述具有第一朝向的虚拟对象。
可选地,该第二视角画面上也叠加显示有该视角调整控件。
示意性的,请参考图6,用户在图5所示的用户界面51中对视角调整控件54进行向右拖动操作后,终端接收到对该视角调整控件的拖动指令后,对第一视角方向进行调整,得到第二视角方向,并在用户界面61中显示第二视角画面62,该第二视角画面62中包括上述具有第一朝向的虚拟对象53,该虚拟对象53的第一朝向依旧为朝向虚拟环境中的门口55,可选地,该第二视角画面62上还叠加显示有视角调整控件54。
值得注意的是,上述实施例中对视角方向进行调整,可以是对视角方向在虚拟环境中进行旋转,也可以是对视角方向在虚拟环境中进行平移。
综上所述,本实施例提供的在虚拟环境中进行视角调整的方法,通过设置视角调整控件,当用户在第一用户界面上对视角调整控件进行控制时,可以达到对观察虚拟环境的视角方向进行调整而不改变虚拟对象的朝向的效果,即在虚拟对象维持原有朝向,保持原有运动姿态的同时,用户可以通过控制视角调整控件对虚拟环境的其他视角方向进行观察,也即,用户不会因为对其他视角方向进行观察而改变虚拟对象在虚拟环境中的行进方向,避免了视角调整操作与虚拟对象的运动轨迹之间相互影响。
在一个可选的实施例中,该第一视角方向是在虚拟环境中通过摄像机模型对虚拟对象进行观察时的方向,图7是本申请另一个示例性的实施例提供的在虚拟环境中进行视角调整的方法流程图,如图7所示,该在虚拟环境中进行视角调整的方法包括:
步骤701,显示第一视角画面,该第一视角画面是在虚拟环境中以第一视角方向观察虚拟对象时的画面。
可选地,该第一视角画面中包括具有第一朝向的虚拟对象,该第一视角画面上还叠加显示有视角调整控件。
可选地,该第一视角方向是在虚拟环境中通过摄像机模型对虚拟对象进行 观察时的方向。可选地,该第一视角方向是摄像机模型在虚拟环境中位于虚拟对象后方对该虚拟对象进行观察时的方向,也即,从该虚拟对象的后方指向该虚拟对象的方向。
步骤702,接收对视角调整控件的拖动指令。
可选地,在接收到拖动指令后,还可以对虚拟对象的第一朝向进行锁定操作,该锁定操作用于控制虚拟对象的第一朝向不被改变。
步骤703,确定拖动指令的拖动方向。
可选地,用户在对视角调整控件进行拖动时,可以对视角调整控件向用户界面的平面中的任意方向进行拖动,示意性的,请参考图8及图9,图8及图9是对拖动方向进行确定的两种划分方式,图8中以将视角调整控件81向左拖动为0°,按顺时针方向对角度递增直至360°,图9中以将视角调整控件91向左拖动为0°,按顺时针方向对角度递增直至180°,按逆时针方向对角度递减直至-180°。
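The 0°–360° clockwise convention of Fig. 8 (dragging left is 0°, the angle grows clockwise) can be illustrated with a small helper. This is an assumed mapping for screen coordinates with y pointing down; the function name is hypothetical and not part of the application.

```python
import math

def drag_angle(dx, dy):
    """Map a drag vector (screen coordinates, y pointing down) to the
    Fig. 8 convention: dragging left is 0 degrees, growing clockwise.
    atan2 is measured here from the negative x axis (the 'left' direction)."""
    ang = math.degrees(math.atan2(-dy, -dx))
    return ang % 360.0
```

Under this mapping a rightward drag gives 180°, which is the value the later Fig. 10 example associates with dragging the control to the right.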
步骤704,根据拖动指令将摄像机模型以虚拟对象为旋转中心,以拖动方向对应的方向为旋转方向,在虚拟环境中进行旋转。
可选地,该拖动指令还用于指示拖动距离,该拖动指令的拖动距离与摄像机模型在转动过程中的转动角度呈正相关关系。
可选地,摄像机模型的旋转方向与拖动方向相对应,以摄像机模型在旋转中心为球心的球面上进行旋转为例进行说明,首先根据拖动指令在球面上作过摄像机模型的与该拖动方向对应的圆,并根据拖动指令对摄像机模型在该圆上作顺时针旋转或者逆时针旋转。
值得注意的是,上述摄像机模型在虚拟环境中进行旋转的过程中,不仅有摄像机模型角度上的转动,还有该摄像机模型在虚拟环境中由旋转而产生的位移。
示意性的，请参考图10，图10是本实施例一个具体的摄像机模型根据拖动指令进行旋转的举例，如图10所示，在用户界面1010中包括视角调整控件1020以及具有第一朝向的虚拟对象1030，用户对视角调整控件1020向右边进行拖动操作，结合上述图8或者图9可知，用户将该视角调整控件1020向180°方向进行拖动，由于该180°方向在虚拟环境中对应水平方向，在摄像机模型1041所在的球面作一个水平方向的圆，根据拖动方向为180°，将摄像机模型1041在该圆上沿顺时针进行转动，并根据拖动距离确定转动角度，如：拖动距离每1个单位时，将摄像机模型1041沿旋转角度转动15°，本实施例中，以摄像机模型1041旋转至摄像机模型1042的位置为例。
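The rule that the turned angle grows with drag distance (for example, 15° per drag unit, along the horizontal circle through the camera model) can be sketched as a rotation about the vertical axis through the rotation center. This is an illustrative sketch: the ratio constant, the function name, and the "clockwise when viewed from above" axis convention are assumptions.

```python
import math

DEG_PER_UNIT = 15.0  # example ratio from the description: 15 degrees per drag unit

def rotate_on_horizontal_circle(cam_pos, center, drag_units, clockwise=True):
    """Rotate the camera model around the vertical axis through the rotation
    center; the turned angle is proportional to the drag distance, so the
    camera both turns and is displaced while keeping its height and radius."""
    angle = math.radians(DEG_PER_UNIT * drag_units) * (-1.0 if clockwise else 1.0)
    dx = cam_pos[0] - center[0]
    dz = cam_pos[2] - center[2]
    x = center[0] + dx * math.cos(angle) - dz * math.sin(angle)
    z = center[2] + dx * math.sin(angle) + dz * math.cos(angle)
    return (x, cam_pos[1], z)
```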
步骤705,将旋转后的摄像机模型对虚拟环境进行观察的视角方向确定为第二视角方向。
示意性的,结合图10可知,将旋转后的摄像机模型1042对虚拟对象1030进行观察的方向确定为第二视角方向。
可选地,当检测到摄像机模型在旋转过程中存在阻碍时,将该摄像机模型被阻碍时的视角方向确定为第二视角方向。
其中,摄像机模型在旋转过程中存在阻碍的可能性包括如下情况中的至少一种:
第一,在摄像机模型的旋转过程中被穿帮检测限制,其中,穿帮检测是指在虚拟环境中根据物理原理对虚拟对象能否对旋转后的视角方向进行观察进行检测,当检测到摄像机模型在旋转过程中被穿帮检测限制旋转时,将被穿帮检测限制时的视角方向确定为第二视角方向;
可选地,该穿帮检测是指在摄像机模型的旋转过程中,当该摄像机模型不被障碍物阻挡时,虚拟对象能否对旋转后的视角方向进行观察进行检测,如:摄像机模型在旋转过程中,从窗口之外转动到窗口内,以对房间的内部进行观察,而根据物理原理,虚拟对象在其所处的位置无法对房间内进行观察时,则认为该摄像机模型在旋转过程中被穿帮检测限制旋转。
第二,在摄像机模型旋转的过程中被障碍物阻挡,该障碍物是在该虚拟环境中三维模型,当检测到摄像机模型在旋转过程中存在障碍物阻挡时,将摄像机模型被障碍物阻挡时的视角方向确定为第二视角方向。
可选地,摄像机模型在旋转过程中是否被障碍物阻挡可以通过在虚拟环境中进行碰撞检测进行检测。由于在虚拟环境中,可以对物体与物体之间进行碰撞检测,如:按照摄像机模型的旋转路径进行碰撞检测,判断该摄像机模型在旋转过程中是否会与障碍物进行碰撞,当摄像机模型在旋转过程中会与障碍物碰撞时,认为该摄像机模型被障碍物阻挡。
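The clamping behaviour described above — stop the rotation at the viewing angle where the camera model is first obstructed, and take that as the second viewing-angle direction — can be sketched independently of any concrete collision system. Here `is_blocked` stands in for either the clipping ("穿帮") check or the obstacle collision check, and is an assumption for illustration.

```python
def first_unblocked_angle(is_blocked, start_deg, target_deg, step=1.0):
    """Advance the viewing angle from start toward target in small steps and
    stop at the last angle before an obstruction, mirroring the clamp on the
    second viewing-angle direction when the camera model is blocked mid-rotation."""
    angle = start_deg
    direction = 1.0 if target_deg >= start_deg else -1.0
    while abs(target_deg - angle) > 1e-9:
        nxt = angle + direction * min(step, abs(target_deg - angle))
        if is_blocked(nxt):
            break  # the blocked angle is never entered; keep the last valid one
        angle = nxt
    return angle
```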
步骤706,显示第二视角画面,该第二视角画面是在虚拟环境中以第二视角方向观察虚拟对象时的画面。
可选地,该第二视角画面中包括上述具有第一朝向的虚拟对象。可选地,该第二视角画面上也叠加显示有该视角调整控件。
示意性的,结合图10,如用户界面1050所示,旋转后的摄像机模型1042对虚拟对象进行观察时的第二视角画面中包括该具有第一朝向的虚拟对象1030,以及在该第二视角画面上还叠加显示有视角调整控件1020。
可选地,此时用户对该视角调整控件1020依旧处于拖动状态,该拖动状态可以是暂停拖动状态,即持续对视角调整控件1020进行触摸,但暂停拖动,也可以是持续拖动状态。
可选地，在从第一视角画面向第二视角画面过渡的过程中，显示第三视角画面，该第三视角画面为通过将第一视角方向调整至第二视角方向的过程中的过渡视角方向观察虚拟环境时的画面。根据拖动指令将第三视角画面上叠加显示的视角调整控件在预设平移范围内沿拖动方向进行平移显示。请参考图10，将视角调整控件1020在预设平移范围内（框住该视角调整控件的虚线圆形范围内）进行平移显示，如：当拖动方向向右时，将视角调整控件向右平移直至虚线圆形的边缘位置。
综上所述,本实施例提供的在虚拟环境中进行视角调整的方法,通过设置视角调整控件,当用户在第一用户界面上对视角调整控件进行控制时,可以达到对观察虚拟环境的视角方向进行调整而不改变虚拟对象的朝向的效果,即在虚拟对象维持原有朝向,保持原有运动姿态的同时,用户可以通过控制视角调整控件对虚拟环境的其他视角方向进行观察,也即,用户不会因为对其他视角方向进行观察而改变虚拟对象在虚拟环境中的行进方向,提高了用户对虚拟环境进行观察的便捷程度。
本实施例提供的在虚拟环境中进行视角调整的方法,通过摄像机模型对虚拟环境中的虚拟对象进行观察,通过摄像机模型将虚拟对象的朝向与对虚拟环境的观察方向两者分离进行,用户可以通过控制视角调整控件旋转摄像机模型。对虚拟环境的其他视角方向进行观察,提高了用户对虚拟环境进行观察的便捷程度。
在一个可选的实施例中,在当检测到拖动指令停止时,终端将第二视角方向再调整至第一视角方向,请参考图11,图11是本申请另一个示例性的实施例提供的在虚拟环境中进行视角调整的方法流程图,该在虚拟环境中进行视角调整的方法包括:
步骤1101，显示第一视角画面，该第一视角画面是在虚拟环境中以第一视角方向观察虚拟对象时的画面。
可选地,该第一视角画面中包括具有第一朝向的虚拟对象,该第一视角画面上还叠加显示有视角调整控件。
可选地,该第一视角方向是在虚拟环境中通过摄像机模型对虚拟对象进行观察时的方向。可选地,该第一视角方向是摄像机模型在虚拟环境中位于虚拟对象后方对该虚拟对象进行观察时的方向,也即,从该虚拟对象的后方指向该虚拟对象的方向。
步骤1102,接收对视角调整控件的拖动指令。
可选地,在接收到拖动指令后,还可以对虚拟对象的第一朝向进行锁定操作,该锁定操作用于控制虚拟对象的第一朝向不被改变。
步骤1103,确定拖动指令的拖动方向。
可选地,用户在对视角调整控件进行拖动时,可以对视角调整控件向用户界面的平面中的任意方向进行拖动,确定拖动指令的拖动方向已在步骤703中进行了具体说明,此处不再赘述。
步骤1104,根据拖动指令将摄像机模型以虚拟对象为旋转中心,以拖动方向对应的方向为旋转方向,在虚拟环境中进行旋转。
可选地,该拖动指令还用于指示拖动距离,该拖动指令的拖动距离与摄像机模型在转动过程中的转动角度呈正相关关系。
可选地,摄像机模型的旋转方向与拖动方向相对应,以摄像机模型在旋转中心为球心的球面上进行旋转为例进行说明,首先根据拖动指令在球面上作过摄像机模型的与该拖动方向对应的圆,并根据拖动指令对摄像机模型在该圆上作顺时针旋转或者逆时针旋转。
步骤1105,将旋转后的摄像机模型对虚拟环境进行观察的视角方向确定为第二视角方向。
步骤1106,显示第二视角画面,该第二视角画面是在虚拟环境中以第二视角方向观察虚拟对象时的画面。
可选地,该第二视角画面中包括上述具有第一朝向的虚拟对象。可选地,该第二视角画面上也叠加显示有该视角调整控件。
步骤1107,检测对视角调整控件的拖动指令。
可选地，该拖动指令的状态可以包括：拖动指令持续、拖动指令暂停或者拖动指令停止。其中，拖动指令持续是指用户对视角调整控件进行持续拖动，拖动指令暂停是指用户对该视角调整控件持续进行触摸（或按压或点击）但暂停对该视角调整控件的拖动，拖动指令停止是指用户停止对该视角调整控件的触摸（或按压或点击），也停止对该视角调整控件的拖动。
步骤1108,当检测到拖动指令停止时,将第二视角方向沿预设轨迹自动调整至第一视角方向。
可选地,将第二视角方向沿预设轨迹自动调整至第一视角方向时,至少包括如下方式中的任意一种:
第一,记录从第一视角方向旋转至第二视角方向时的旋转路径,根据该旋转路径将第二视角方向沿该旋转路径旋转至第一视角方向;
第二,获取第二视角方向旋转至第一视角方向的最短旋转路径,并根据该最短旋转路径将第二视角方向旋转至第一视角方向。
其中，将第二视角方向调整至第一视角方向是将摄像机模型从第二位置旋转至第一位置。其中，第二位置为摄像机模型以第二视角方向观察虚拟环境时的位置，第一位置为摄像机模型以第一视角方向观察虚拟环境时的位置。
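For a single rotation axis, returning the second viewing-angle direction to the first along the shortest rotation path reduces to stepping an angle along the shorter arc, wrapping across 0°/360°. The helper below is an illustrative sketch; the per-frame step size is an assumed parameter, not something the application specifies.

```python
def shortest_return_step(current_deg, home_deg, max_step):
    """One frame of returning the viewing angle to its home direction along
    the shortest arc. The (diff + 180) % 360 - 180 trick folds the difference
    into [-180, 180), so the sign picks the shorter rotation direction."""
    diff = (home_deg - current_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= max_step:
        return home_deg  # close enough: snap back to the first viewing angle
    return (current_deg + max_step * (1.0 if diff > 0 else -1.0)) % 360.0
```

Calling this once per frame until it returns the home angle reproduces the "automatically adjust along a preset trajectory" behaviour for the shortest-path variant.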
步骤1109,当虚拟对象的状态为站立状态或蹲下状态或趴下状态时,显示第一视角画面。
当虚拟对象的状态为站立状态或者蹲下状态或者趴下状态时,该虚拟对象在虚拟环境中的位置和朝向都没有发生改变,故将第二视角方向调整至第一视角方向后,直接显示第一视角画面。
步骤1110,当虚拟对象的状态为持续奔跑、持续步行、驾驶载具、乘坐载具中的任意一个时,显示更新后的第一视角画面。
该更新后的第一视角画面是以第一视角方向观察虚拟对象时,由于虚拟对象在虚拟环境中的移动而更新的画面,该更新后的第一视角画面中包括朝向第一方向的虚拟对象。由于虚拟对象在虚拟环境中的移动,虚拟对象在虚拟环境中的位置可能发生改变,所以,第一视角画面会随着虚拟对象在虚拟环境中的位置发生改变而进行更新。
当虚拟对象的状态为持续奔跑、持续步行、驾驶载具、乘坐载具中的任意一个时,该虚拟对象在虚拟环境中的朝向没有发生改变,但是虚拟对象在虚拟环境中的位置发生了变化,故将第二视角方向调整至第一视角方向后,显示更新后的第一视角画面。
综上所述，本实施例提供的在虚拟环境中进行视角调整的方法，通过设置视角调整控件，当用户在第一用户界面上对视角调整控件进行控制时，可以达到对观察虚拟环境的视角方向进行调整而不改变虚拟对象的朝向的效果，即在虚拟对象维持原有朝向，保持原有运动姿态的同时，用户可以通过控制视角调整控件对虚拟环境的其他视角方向进行观察，也即，用户不会因为对其他视角方向进行观察而改变虚拟对象在虚拟环境中的行进方向，提高了用户对虚拟环境进行观察的便捷程度。
本实施例提供的在虚拟环境中进行视角调整的方法,通过摄像机模型对虚拟环境中的虚拟对象进行观察,通过摄像机模型将虚拟对象的朝向与对虚拟环境的观察方向两者分离进行,用户可以通过控制视角调整控件旋转摄像机模型。对虚拟环境的其他视角方向进行观察,提高了用户对虚拟环境进行观察的便捷程度。
本实施例提供的在虚拟环境中进行视角调整的方法,当检测到拖动指令停止时,将第二视角方向沿预设轨迹自动调整至第一视角方向,用户可以通过停止对视角调整控件进行拖动或者停止对视角调整控件进行触摸,将视角方向还原至第一视角方向,提高了用户对虚拟环境进行观察的便捷程度。
图12是本申请另一个示例性的实施例提供的在虚拟环境中进行视角调整的方法流程图,该在虚拟环境中进行视角调整的方法包括:
步骤1201,生成视角调整控件。
首先,终端判断该应用程序中的视角调整控件是否被开启,当该视角调整控件被开启时,生成视角调整控件,并将其叠加显示在视角画面之上。
步骤1202,判断视角调整控件是否被按下。
步骤1203,当视角调整控件被按下时,锁定虚拟对象的朝向,判断拖动方向。
当视角调整控件被按下时,说明用户需要对虚拟环境的其他角度进行观察,但需要虚拟对象保持现有朝向,则锁定该虚拟对象的朝向,并判断拖动操作的拖动方向。
步骤1204,摄像机模型朝向虚拟对象并跟随虚拟对象。
摄像机模型始终保持对虚拟对象进行观察的角度,并跟随该虚拟对象,即,当虚拟对象在运动过程中时,摄像机模型跟随该虚拟对象,并围绕该虚拟对象进行旋转。
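The auto-follow behaviour (the camera model stays within a preset distance of the virtual object as the object moves) can be sketched as a distance clamp along the camera-object line. The function name and the clamp strategy are assumptions for illustration, not the application's required implementation.

```python
import math

def follow_camera(cam_pos, target_pos, max_distance):
    """Keep the camera model within a preset distance of the virtual object:
    if the object has moved too far away, pull the camera along its line of
    sight until it is exactly max_distance away."""
    d = [c - t for c, t in zip(cam_pos, target_pos)]
    dist = math.sqrt(sum(v * v for v in d))
    if dist <= max_distance:
        return tuple(cam_pos)  # already inside the preset range
    scale = max_distance / dist
    return tuple(t + v * scale for t, v in zip(target_pos, d))
```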
步骤1205,摄像机模型向拖动方向对应的方向进行旋转。
具体旋转过程在上述步骤704中已进行了说明,此处不再赘述。
步骤1206,停止对视角调整控件的拖动后,还原摄像机模型的观察方向。
即将摄像机模型的观察方向还原至未对视角调整控件进行按下之前的观察方向。
步骤1207,遇到无意义视角时,进行摄像机自动跟随,显示合理内容。
在摄像机模型的旋转过程中,持续检测旋转得到的视角是否是无意义视角,该无意义视角即根据物理原理无法达到的视角。当旋转后的视角是无意义视角时,控制摄像机模型的视角不继续向该无意义视角进行旋转。
示意性的,该无意义视角可以是摄像机模型旋转至墙壁内的视角,或者摄像机模型旋转至地面之下的视角,或者摄像机模型旋转至窗口内观察房间内的虚拟环境时的视角,而根据物理原理,虚拟对象在所处的位置无法对房间内的虚拟环境进行观察。
可选地,在用户界面中包括两种操作,如图13所示:步骤1301,生成视角调整控件。步骤1302,判断是拖动视角区域还是拖动视角调整控件。步骤1303,当拖动视角调整控件时,摄像头模型与虚拟对象的朝向独立。步骤1304,虚拟对象的移动不改变。步骤1305,当拖动视角区域时,摄像头模型与虚拟对象的朝向同步,保持追尾视角。
即对视角区域进行拖动以及对视角调整控件进行拖动分为如下两种情况:
第一，拖动视角区域，即摄像机模型的观察视角与虚拟对象的朝向保持同步，当对观察视角进行调整时，虚拟对象跟随摄像机模型的旋转进行转身操作，且摄像机模型对虚拟对象保持追尾视角，可选地，该视角区域可以是用户界面的左半边或者右半边，也可以是预先设定的一块区域；
第二,拖动视角调整控件,摄像机模型与虚拟对象的朝向独立,虚拟对象的移动不受改变。
图14是本申请一个示例性的实施例提供的在虚拟环境中进行视角调整的装置的结构框图,该装置可以设置在终端中,该在虚拟环境中进行视角调整的装置包括:显示模块1410、接收模块1420、调整模块1430;
显示模块1410，用于显示第一视角画面，第一视角画面是在虚拟环境中以第一视角方向观察虚拟对象时的画面，第一视角画面中包括具有第一朝向的虚拟对象，第一视角画面上还叠加显示有视角调整控件；
接收模块1420,用于接收对视角调整控件的拖动指令;
调整模块1430,用于根据拖动指令对第一视角方向进行调整,得到第二视角方向;
显示模块1410,还用于显示第二视角画面,第二视角画面是在虚拟环境中以第二视角方向观察虚拟对象时的画面,第二视角画面中包括具有第一朝向的虚拟对象。
在一个可选的实施例中,第一视角方向是在虚拟环境中通过摄像机模型对虚拟对象进行观察时的方向;
如图15所示,调整模块1430,包括:
确定单元1431,用于确定拖动指令的拖动方向;
调整单元1432,用于根据拖动指令将摄像机模型以虚拟对象为旋转中心,以拖动方向对应的方向为旋转方向在虚拟环境中进行旋转;
确定单元1431,还用于将旋转后的摄像机模型对虚拟对象进行观察的方向确定为第二视角方向。
在一个可选的实施例中,确定单元1431,还用于当检测到摄像机模型在旋转过程中存在阻碍时,将摄像机模型被阻碍时的视角方向确定为第二视角方向。
在一个可选的实施例中,被阻碍时的视角方向为被穿帮检测限制时的视角方向,穿帮检测是指在虚拟环境中根据物理原理对虚拟对象能否对旋转后的视角方向进行观察进行检测;
确定单元1431,还用于当检测到摄像机模型在旋转过程中被穿帮检测限制旋转时,将被穿帮检测限制时的视角方向确定为第二视角方向。
在一个可选的实施例中,摄像机模型被阻碍时的视角方向为摄像机模型被障碍物阻挡时的视角方向;
确定单元1431,还用于当检测到摄像机模型在旋转过程中存在障碍物阻挡时,将摄像机模型被障碍物阻挡时的视角方向确定为第二视角方向。
在一个可选的实施例中,该装置,还包括:
锁定模块1440,用于对虚拟对象的第一朝向进行锁定操作,锁定操作用于控制虚拟对象的第一朝向不被改变。
在一个可选的实施例中,该装置,还包括:
检测模块1450,用于检测对视角调整控件的拖动指令;
调整模块1430,还用于当检测到拖动指令停止时,将第二视角方向沿预设轨迹自动调整至第一视角方向;
显示模块1410,还用于当虚拟对象的状态为站立状态或下蹲状态或趴下状态时,显示第一视角画面。
在一个可选的实施例中,该装置,还包括:
检测模块1450,用于检测对视角调整控件的控制信号;
调整模块1430,还用于当检测到控制信号停止时,将第二视角方向沿预设轨迹调整至第一视角方向;
显示模块1410,还用于当虚拟对象的状态为持续奔跑、持续步行、驾驶载具、乘坐载具中的任意一个时,显示更新后的第一视角画面,更新后的第一视角画面是以第一视角方向观察虚拟对象时,由于虚拟对象在虚拟环境中的移动而更新的画面,更新后的第一视角画面中包括朝向第一方向的虚拟对象。
在一个可选的实施例中,显示模块1410,还用于显示第三视角画面,第三视角画面为通过将第一视角方向调整至第二视角方向的过程中的过渡视角方向观察虚拟对象时的画面;根据拖动指令,将在第三视角画面上叠加显示的视角调整控件在预设平移范围内沿拖动方向进行平移显示。
图16示出了本发明一个示例性实施例提供的终端1600的结构框图。该终端1600可以是:智能手机、平板电脑、MP3播放器(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器、笔记本电脑或台式电脑。终端1600还可能被称为用户设备、便携式终端、膝上型终端、台式终端等其他名称。
通常,终端1600包括有:处理器1601和存储器1602。
处理器1601可以包括一个或多个处理核心，比如4核心处理器、8核心处理器等。处理器1601可以采用DSP（Digital Signal Processing，数字信号处理）、FPGA（Field-Programmable Gate Array，现场可编程门阵列）、PLA（Programmable Logic Array，可编程逻辑阵列）中的至少一种硬件形式来实现。处理器1601也可以包括主处理器和协处理器，主处理器是用于对在唤醒状态下的数据进行处理的处理器，也称CPU（Central Processing Unit，中央处理器）；协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中，处理器1601可以集成有GPU（Graphics Processing Unit，图像处理器），GPU用于负责显示屏所需要显示的内容的渲染和绘制。一些实施例中，处理器1601还可以包括AI（Artificial Intelligence，人工智能）处理器，该AI处理器用于处理有关机器学习的计算操作。
存储器1602可以包括一个或多个计算机可读存储介质,该计算机可读存储介质可以是非暂态的。存储器1602还可包括高速随机存取存储器,以及非易失性存储器,比如一个或多个磁盘存储设备、闪存存储设备。在一些实施例中,存储器1602中的非暂态的计算机可读存储介质用于存储至少一个指令,该至少一个指令用于被处理器1601所执行以实现本申请中方法实施例提供的在虚拟环境中进行视角调整的方法。
在一些实施例中,终端1600还可选包括有:外围设备接口1603和至少一个外围设备。处理器1601、存储器1602和外围设备接口1603之间可以通过总线或信号线相连。各个外围设备可以通过总线、信号线或电路板与外围设备接口1603相连。具体地,外围设备包括:射频电路1604、触摸显示屏1605、摄像头1606、音频电路1607、定位组件1608和电源1609中的至少一种。
外围设备接口1603可被用于将I/O(Input/Output,输入/输出)相关的至少一个外围设备连接到处理器1601和存储器1602。在一些实施例中,处理器1601、存储器1602和外围设备接口1603被集成在同一芯片或电路板上;在一些其他实施例中,处理器1601、存储器1602和外围设备接口1603中的任意一个或两个可以在单独的芯片或电路板上实现,本实施例对此不加以限定。
射频电路1604用于接收和发射RF(Radio Frequency,射频)信号,也称电磁信号。射频电路1604通过电磁信号与通信网络以及其他通信设备进行通信。射频电路1604将电信号转换为电磁信号进行发送,或者,将接收到的电磁信号转换为电信号。可选地,射频电路1604包括:天线系统、RF收发器、一个或多个放大器、调谐器、振荡器、数字信号处理器、编解码芯片组、用户身份模块卡等等。射频电路1604可以通过至少一种无线通信协议来与其它终端进行通信。该无线通信协议包括但不限于:万维网、城域网、内联网、各代移动通信网络(2G、3G、4G及5G)、无线局域网和/或WiFi(Wireless Fidelity,无线保真)网络。在一些实施例中,射频电路1604还可以包括NFC(Near Field Communication,近距离无线通信)有关的电路,本申请对此不加以限定。
显示屏1605用于显示UI（User Interface，用户界面）。该UI可以包括图形、文本、图标、视频及其它们的任意组合。当显示屏1605是触摸显示屏时，显示屏1605还具有采集在显示屏1605的表面或表面上方的触摸信号的能力。该触摸信号可以作为控制信号输入至处理器1601进行处理。此时，显示屏1605还可以用于提供虚拟按钮和/或虚拟键盘，也称软按钮和/或软键盘。在一些实施例中，显示屏1605可以为一个，设置在终端1600的前面板；在另一些实施例中，显示屏1605可以为至少两个，分别设置在终端1600的不同表面或呈折叠设计；在再一些实施例中，显示屏1605可以是柔性显示屏，设置在终端1600的弯曲表面上或折叠面上。甚至，显示屏1605还可以设置成非矩形的不规则图形，也即异形屏。显示屏1605可以采用LCD（Liquid Crystal Display，液晶显示屏）、OLED（Organic Light-Emitting Diode，有机发光二极管）等材质制备。
摄像头组件1606用于采集图像或视频。可选地,摄像头组件1606包括前置摄像头和后置摄像头。通常,前置摄像头设置在终端的前面板,后置摄像头设置在终端的背面。在一些实施例中,后置摄像头为至少两个,分别为主摄像头、景深摄像头、广角摄像头、长焦摄像头中的任意一种,以实现主摄像头和景深摄像头融合实现背景虚化功能、主摄像头和广角摄像头融合实现全景拍摄以及VR(Virtual Reality,虚拟现实)拍摄功能或者其它融合拍摄功能。在一些实施例中,摄像头组件1606还可以包括闪光灯。闪光灯可以是单色温闪光灯,也可以是双色温闪光灯。双色温闪光灯是指暖光闪光灯和冷光闪光灯的组合,可以用于不同色温下的光线补偿。
音频电路1607可以包括麦克风和扬声器。麦克风用于采集用户及环境的声波,并将声波转换为电信号输入至处理器1601进行处理,或者输入至射频电路1604以实现语音通信。出于立体声采集或降噪的目的,麦克风可以为多个,分别设置在终端1600的不同部位。麦克风还可以是阵列麦克风或全向采集型麦克风。扬声器则用于将来自处理器1601或射频电路1604的电信号转换为声波。扬声器可以是传统的薄膜扬声器,也可以是压电陶瓷扬声器。当扬声器是压电陶瓷扬声器时,不仅可以将电信号转换为人类可听见的声波,也可以将电信号转换为人类听不见的声波以进行测距等用途。在一些实施例中,音频电路1607还可以包括耳机插孔。
定位组件1608用于定位终端1600的当前地理位置，以实现导航或LBS（Location Based Service，基于位置的服务）。定位组件1608可以是基于美国的GPS（Global Positioning System，全球定位系统）、中国的北斗系统或俄罗斯的伽利略系统的定位组件。
电源1609用于为终端1600中的各个组件进行供电。电源1609可以是交流电、直流电、一次性电池或可充电电池。当电源1609包括可充电电池时,该可充电电池可以是有线充电电池或无线充电电池。有线充电电池是通过有线线路充电的电池,无线充电电池是通过无线线圈充电的电池。该可充电电池还可以用于支持快充技术。
在一些实施例中,终端1600还包括有一个或多个传感器1610。该一个或多个传感器1610包括但不限于:加速度传感器1611、陀螺仪传感器1612、压力传感器1613、指纹传感器1614、光学传感器1615以及接近传感器1616。
加速度传感器1611可以检测以终端1600建立的坐标系的三个坐标轴上的加速度大小。比如,加速度传感器1611可以用于检测重力加速度在三个坐标轴上的分量。处理器1601可以根据加速度传感器1611采集的重力加速度信号,控制触摸显示屏1605以横向视图或纵向视图进行用户界面的显示。加速度传感器1611还可以用于游戏或者用户的运动数据的采集。
陀螺仪传感器1612可以检测终端1600的机体方向及转动角度,陀螺仪传感器1612可以与加速度传感器1611协同采集用户对终端1600的3D动作。处理器1601根据陀螺仪传感器1612采集的数据,可以实现如下功能:动作感应(比如根据用户的倾斜操作来改变UI)、拍摄时的图像稳定、游戏控制以及惯性导航。
压力传感器1613可以设置在终端1600的侧边框和/或触摸显示屏1605的下层。当压力传感器1613设置在终端1600的侧边框时,可以检测用户对终端1600的握持信号,由处理器1601根据压力传感器1613采集的握持信号进行左右手识别或快捷操作。当压力传感器1613设置在触摸显示屏1605的下层时,由处理器1601根据用户对触摸显示屏1605的压力操作,实现对UI界面上的可操作性控件进行控制。可操作性控件包括按钮控件、滚动条控件、图标控件、菜单控件中的至少一种。
指纹传感器1614用于采集用户的指纹，由处理器1601根据指纹传感器1614采集到的指纹识别用户的身份，或者，由指纹传感器1614根据采集到的指纹识别用户的身份。在识别出用户的身份为可信身份时，由处理器1601授权该用户执行相关的敏感操作，该敏感操作包括解锁屏幕、查看加密信息、下载软件、支付及更改设置等。指纹传感器1614可以被设置在终端1600的正面、背面或侧面。当终端1600上设置有物理按键或厂商Logo时，指纹传感器1614可以与物理按键或厂商Logo集成在一起。
光学传感器1615用于采集环境光强度。在一个实施例中,处理器1601可以根据光学传感器1615采集的环境光强度,控制触摸显示屏1605的显示亮度。具体地,当环境光强度较高时,调高触摸显示屏1605的显示亮度;当环境光强度较低时,调低触摸显示屏1605的显示亮度。在另一个实施例中,处理器1601还可以根据光学传感器1615采集的环境光强度,动态调整摄像头组件1606的拍摄参数。
接近传感器1616,也称距离传感器,通常设置在终端1600的前面板。接近传感器1616用于采集用户与终端1600的正面之间的距离。在一个实施例中,当接近传感器1616检测到用户与终端1600的正面之间的距离逐渐变小时,由处理器1601控制触摸显示屏1605从亮屏状态切换为息屏状态;当接近传感器1616检测到用户与终端1600的正面之间的距离逐渐变大时,由处理器1601控制触摸显示屏1605从息屏状态切换为亮屏状态。
本领域技术人员可以理解,图16中示出的结构并不构成对终端1600的限定,可以包括比图示更多或更少的组件,或者组合某些组件,或者采用不同的组件布置。
本领域普通技术人员可以理解上述实施例的各种方法中的全部或部分步骤是可以通过程序来指令相关的硬件来完成,该程序可以存储于一计算机可读存储介质中,该计算机可读存储介质可以是上述实施例中的存储器中所包含的计算机可读存储介质;也可以是单独存在,未装配入终端中的计算机可读存储介质。该计算机可读存储介质中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现如图1至图10任一所述的在虚拟环境中进行视角调整的方法。
可选地，该计算机可读存储介质可以包括：只读存储器（ROM，Read Only Memory）、随机存取记忆体（RAM，Random Access Memory）、固态硬盘（SSD，Solid State Drives）或光盘等。其中，随机存取记忆体可以包括电阻式随机存取记忆体（ReRAM，Resistance Random Access Memory）和动态随机存取存储器（DRAM，Dynamic Random Access Memory）。上述本申请实施例序号仅仅为了描述，不代表实施例的优劣。
本领域普通技术人员可以理解实现上述实施例的全部或部分步骤可以通过硬件来完成,也可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,上述提到的存储介质可以是只读存储器,磁盘或光盘等。
以上所述仅为本申请的较佳实施例,并不用以限制本申请,凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (20)

  1. 一种在虚拟环境中进行视角调整的方法,其特征在于,应用于终端中,所述方法包括:
    显示第一视角画面,所述第一视角画面是在所述虚拟环境中以第一视角方向观察虚拟对象时的画面,所述第一视角画面中包括具有第一朝向的所述虚拟对象,所述第一视角画面上还叠加显示有视角调整控件;
    接收对所述视角调整控件的拖动指令;
    根据所述拖动指令对所述第一视角方向进行调整,得到第二视角方向;
    显示第二视角画面,所述第二视角画面是在所述虚拟环境中以所述第二视角方向观察所述虚拟对象时的画面,所述第二视角画面中包括具有所述第一朝向的所述虚拟对象。
  2. 根据权利要求1所述的方法,其特征在于,所述第一视角方向是在所述虚拟环境中通过摄像机模型对所述虚拟对象进行观察时的方向;
    所述根据所述拖动指令对所述第一视角方向进行调整,得到第二视角方向,包括:
    确定所述拖动指令的拖动方向;
    根据所述拖动指令将所述摄像机模型以所述虚拟对象为旋转中心,以所述拖动方向对应的方向为旋转方向在所述虚拟环境中进行旋转;
    将旋转后的所述摄像机模型对所述虚拟对象进行观察的视角方向确定为所述第二视角方向。
  3. 根据权利要求2所述的方法,其特征在于,所述将旋转后的所述摄像机模型对所述虚拟对象进行观察的方向确定为所述第二视角方向,包括:
    当检测到所述摄像机模型在旋转过程中存在阻碍时,将所述摄像机模型被阻碍时的视角方向确定为所述第二视角方向。
  4. 根据权利要求3所述的方法,其特征在于,所述被阻碍时的视角方向为被穿帮检测限制时的视角方向,所述穿帮检测是指在所述虚拟环境中根据物理原理对虚拟对象能否对旋转后的视角方向观察进行检测;
    所述当检测到所述摄像机模型在旋转过程中存在阻碍时,将所述摄像机模型被阻碍时的视角方向确定为所述第二视角方向,包括:
    当检测到所述摄像机模型在旋转过程中被所述穿帮检测限制旋转时,将被所述穿帮检测限制时的视角方向确定为所述第二视角方向。
  5. 根据权利要求3所述的方法,其特征在于,所述摄像机模型被阻碍时的视角方向为所述摄像机模型被障碍物阻挡时的视角方向;
    所述当检测到所述摄像机模型在旋转过程中存在阻碍时,将所述摄像机模型被阻碍时的视角方向确定为所述第二视角方向,包括:
    当检测到所述摄像机模型在旋转过程中存在所述障碍物阻挡时,将所述摄像机模型被所述障碍物阻挡时的视角方向确定为所述第二视角方向。
  6. 根据权利要求1至5任一所述的方法,其特征在于,所述接收对所述视角调整控件的拖动指令之后,还包括:
    对所述虚拟对象的所述第一朝向进行锁定操作,所述锁定操作用于控制所述虚拟对象的所述第一朝向不被改变。
  7. 根据权利要求1至5任一所述的方法,其特征在于,所述显示第二视角画面之后,还包括:
    检测对所述视角调整控件的拖动指令;
    当检测到所述拖动指令停止时,将所述第二视角方向沿预设轨迹自动调整至所述第一视角方向;
    当所述虚拟对象的状态为站立状态或下蹲状态或趴下状态时,显示所述第一视角画面。
  8. 根据权利要求1至5任一所述的方法,其特征在于,所述显示第二视角画面之后,还包括:
    检测对所述视角调整控件的控制信号;
    当检测到所述控制信号停止时,将所述第二视角方向沿预设轨迹调整至所述第一视角方向;
    当所述虚拟对象的状态为持续奔跑、持续步行、驾驶载具、乘坐载具中的任意一个时,显示更新后的第一视角画面,所述更新后的第一视角画面是以所述第一视角方向观察所述虚拟对象时,由于所述虚拟对象在所述虚拟环境中的移动而更新的画面,所述更新后的第一视角画面中包括朝向所述第一方向的所述虚拟对象。
  9. The method according to any one of claims 1 to 5, further comprising, after the receiving a drag instruction for the viewing angle adjustment control:
    displaying a third viewing angle picture, the third viewing angle picture being a picture of the virtual object observed in a transitional viewing angle direction during the adjustment from the first viewing angle direction to the second viewing angle direction; and
    translating, according to the drag instruction, the viewing angle adjustment control superimposed and displayed on the third viewing angle picture along a drag direction of the drag instruction within a preset translation range.
  10. An apparatus for adjusting a viewing angle in a virtual environment, the apparatus comprising:
    a display module, configured to display a first viewing angle picture, the first viewing angle picture being a picture of a virtual object observed in the virtual environment in a first viewing angle direction, the first viewing angle picture comprising the virtual object having a first orientation, and a viewing angle adjustment control being further superimposed and displayed on the first viewing angle picture;
    a receiving module, configured to receive a drag instruction for the viewing angle adjustment control; and
    an adjustment module, configured to adjust the first viewing angle direction according to the drag instruction to obtain a second viewing angle direction,
    the display module being further configured to display a second viewing angle picture, the second viewing angle picture being a picture of the virtual object observed in the virtual environment in the second viewing angle direction, the second viewing angle picture comprising the virtual object having the first orientation.
  11. The apparatus according to claim 10, wherein the first viewing angle direction is a direction in which the virtual object is observed through a camera model in the virtual environment; and
    the adjustment module comprises:
    a determining unit, configured to determine a drag direction of the drag instruction; and
    an adjustment unit, configured to rotate, according to the drag instruction, the camera model in the virtual environment by using the virtual object as a rotation center and using a direction corresponding to the drag direction as a rotation direction,
    the determining unit being further configured to determine a viewing angle direction in which the rotated camera model observes the virtual object as the second viewing angle direction.
  12. The apparatus according to claim 11, wherein the determining unit is further configured to: when it is detected that the camera model is obstructed during the rotation, determine a viewing angle direction of the camera model when the camera model is obstructed as the second viewing angle direction.
  13. The apparatus according to claim 12, wherein the viewing angle direction when the camera model is obstructed is a viewing angle direction obtained when the rotation is restricted by clipping detection, the clipping detection being detection, performed in the virtual environment according to a physical principle, of whether the virtual object can be observed in the rotated viewing angle direction; and
    the determining unit is further configured to: when it is detected that the rotation of the camera model is restricted by the clipping detection during the rotation, determine the viewing angle direction obtained when the rotation is restricted by the clipping detection as the second viewing angle direction.
  14. The apparatus according to claim 12, wherein the viewing angle direction when the camera model is obstructed is a viewing angle direction obtained when the camera model is blocked by an obstacle; and
    the determining unit is further configured to: when it is detected that the camera model is blocked by the obstacle during the rotation, determine the viewing angle direction obtained when the camera model is blocked by the obstacle as the second viewing angle direction.
  15. The apparatus according to any one of claims 10 to 14, further comprising:
    a locking module, configured to perform a locking operation on the first orientation of the virtual object, the locking operation being used for keeping the first orientation of the virtual object unchanged.
  16. The apparatus according to any one of claims 10 to 14, further comprising:
    a detection module, configured to detect the drag instruction for the viewing angle adjustment control,
    the adjustment module being further configured to: when it is detected that the drag instruction stops, automatically adjust the second viewing angle direction back to the first viewing angle direction along a preset track; and
    the display module being further configured to: when a state of the virtual object is a standing state, a squatting state, or a lying-down state, display the first viewing angle picture.
  17. The apparatus according to any one of claims 10 to 14, further comprising:
    a detection module, configured to detect a control signal for the viewing angle adjustment control,
    the adjustment module being further configured to: when it is detected that the control signal stops, adjust the second viewing angle direction back to the first viewing angle direction along a preset track; and
    the display module being further configured to: when a state of the virtual object is any one of continuously running, continuously walking, driving a vehicle, or riding in a vehicle, display an updated first viewing angle picture, the updated first viewing angle picture being a picture updated as a result of movement of the virtual object in the virtual environment while the virtual object is observed in the first viewing angle direction, the updated first viewing angle picture comprising the virtual object having the first orientation.
  18. The apparatus according to any one of claims 10 to 14, wherein the display module is further configured to: display a third viewing angle picture, the third viewing angle picture being a picture of the virtual object observed in a transitional viewing angle direction during the adjustment from the first viewing angle direction to the second viewing angle direction; and translate, according to the drag instruction, the viewing angle adjustment control superimposed and displayed on the third viewing angle picture along a drag direction of the drag instruction within a preset translation range.
  19. A terminal, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the method for adjusting a viewing angle in a virtual environment according to any one of claims 1 to 9.
  20. A computer-readable storage medium, storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by a processor to implement the method for adjusting a viewing angle in a virtual environment according to any one of claims 1 to 9.
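Claims 2 to 5 describe orbiting a camera model around the virtual object according to the drag direction and stopping the rotation in the direction where the camera is obstructed, while the object itself keeps its first orientation. The following is a minimal sketch of such an orbit camera; the yaw/pitch parameterization, the `sensitivity` value, and the pitch clamp standing in for the publication's obstruction/clipping check are illustrative assumptions, not details taken from the patent.

```python
import math

def rotate_camera(yaw_deg, pitch_deg, drag_dx, drag_dy,
                  sensitivity=0.2, pitch_limits=(-80.0, 80.0)):
    """Rotate the camera model around the virtual object (the rotation
    center), mapping the drag vector to yaw/pitch deltas.  The pitch
    clamp plays the role of the obstruction check in claims 3-5: once
    the camera would be blocked, rotation stops at the blocked angle."""
    new_yaw = (yaw_deg + drag_dx * sensitivity) % 360.0
    new_pitch = pitch_deg + drag_dy * sensitivity
    lo, hi = pitch_limits
    new_pitch = max(lo, min(hi, new_pitch))  # stop at the "obstructed" angle
    return new_yaw, new_pitch

def camera_position(center, distance, yaw_deg, pitch_deg):
    """Place the camera on a sphere of radius `distance` around the
    observed virtual object and return its world-space position."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    cx, cy, cz = center
    return (cx + distance * math.cos(pitch) * math.sin(yaw),
            cy + distance * math.sin(pitch),
            cz + distance * math.cos(pitch) * math.cos(yaw))
```

In a real engine the clamp would be replaced by a raycast or sphere cast from the object toward the camera to find the blocking obstacle; note that only the camera model moves, so the virtual object's first orientation is unchanged throughout the adjustment.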
PCT/CN2019/078756 2018-04-16 2019-03-19 Method and apparatus for adjusting viewing angle in virtual environment, and readable storage medium WO2019201047A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/937,332 US11151773B2 (en) 2018-04-16 2020-07-23 Method and apparatus for adjusting viewing angle in virtual environment, and readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810339749.7 2018-04-16
CN201810339749.7A CN108499105B (zh) 2018-04-16 2018-04-16 Method and apparatus for adjusting viewing angle in virtual environment, and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/937,332 Continuation US11151773B2 (en) 2018-04-16 2020-07-23 Method and apparatus for adjusting viewing angle in virtual environment, and readable storage medium

Publications (1)

Publication Number Publication Date
WO2019201047A1 true WO2019201047A1 (zh) 2019-10-24

Family

ID=63382017

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/078756 WO2019201047A1 (zh) 2018-04-16 2019-03-19 在虚拟环境中进行视角调整的方法、装置及可读存储介质

Country Status (3)

Country Link
US (1) US11151773B2 (zh)
CN (1) CN108499105B (zh)
WO (1) WO2019201047A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112330823A (zh) * 2020-11-05 2021-02-05 腾讯科技(深圳)有限公司 Virtual prop display method, apparatus, and device, and readable storage medium
CN113713385A (zh) * 2021-09-14 2021-11-30 腾讯科技(深圳)有限公司 Virtual prop control method, apparatus, device, medium, and computer program product
WO2022042435A1 (zh) * 2020-08-26 2022-03-03 腾讯科技(深圳)有限公司 Method, apparatus, device, and storage medium for displaying virtual environment picture
TWI844340B (zh) * 2023-04-21 2024-06-01 台達電子工業股份有限公司 Computing device and computing method for virtual environment measurement

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108245888A (zh) * 2018-02-09 2018-07-06 腾讯科技(深圳)有限公司 Virtual object control method and apparatus, and computer device
CN108499105B (zh) 2018-04-16 2022-02-25 腾讯科技(深圳)有限公司 Method and apparatus for adjusting viewing angle in virtual environment, and storage medium
EP3608174B1 (en) * 2018-08-10 2021-03-24 Lg Electronics Inc. Vehicle display system for vehicle
CN109821237B (zh) * 2019-01-24 2022-04-22 腾讯科技(深圳)有限公司 Viewing angle rotation method, apparatus, device, and storage medium
CN109840043B (zh) * 2019-01-30 2021-08-10 腾讯科技(深圳)有限公司 Method, apparatus, device, and storage medium for constructing building in virtual environment
CN110045827B (zh) * 2019-04-11 2021-08-17 腾讯科技(深圳)有限公司 Method and apparatus for observing virtual article in virtual environment, and readable storage medium
KR102149732B1 (ko) * 2019-04-17 2020-08-31 라쿠텐 인코포레이티드 Display control device, display control method, program, and non-transitory computer-readable information recording medium
JP6924799B2 (ja) * 2019-07-05 2021-08-25 株式会社スクウェア・エニックス Program, image processing method, and image processing system
CN110393916B (zh) * 2019-07-26 2023-03-14 腾讯科技(深圳)有限公司 Viewing angle rotation method, apparatus, device, and storage medium
CN110465073A (zh) * 2019-08-08 2019-11-19 腾讯科技(深圳)有限公司 Method, apparatus, and device for adjusting viewing angle in virtual environment, and readable storage medium
CN110585707B (zh) * 2019-09-20 2020-12-11 腾讯科技(深圳)有限公司 Field-of-view picture display method, apparatus, device, and storage medium
CN110665230B (zh) * 2019-09-26 2020-11-13 腾讯科技(深圳)有限公司 Virtual character control method, apparatus, device, and medium in virtual world
CN110575671B (zh) * 2019-10-08 2023-08-11 网易(杭州)网络有限公司 Method and apparatus for controlling viewing angle in game, and electronic device
CN110665226B (zh) * 2019-10-09 2024-08-23 网易(杭州)网络有限公司 Method, device, and storage medium for controlling virtual object in game
CN111158469A (zh) * 2019-12-12 2020-05-15 广东虚拟现实科技有限公司 Viewing angle switching method, apparatus, terminal device, and storage medium
CN111327817B (zh) * 2020-01-19 2021-08-03 惠州Tcl移动通信有限公司 Virtual photographing method and system, storage medium, and terminal device
US11947119B2 (en) * 2020-02-17 2024-04-02 Sony Group Corporation Display control device, display control method, and recording medium
CN111399639B (zh) * 2020-03-05 2022-07-12 腾讯科技(深圳)有限公司 Method, apparatus, and device for controlling motion state in virtual environment, and readable medium
CN111710047A (zh) * 2020-06-05 2020-09-25 北京有竹居网络技术有限公司 Information display method and apparatus, and electronic device
CN111784844B (zh) * 2020-06-09 2024-01-05 北京五一视界数字孪生科技股份有限公司 Method and apparatus for observing virtual object, storage medium, and electronic device
CN112169330B (zh) * 2020-09-25 2021-12-31 腾讯科技(深圳)有限公司 Picture display method, apparatus, device, and medium for virtual environment
US12128322B2 (en) * 2020-11-12 2024-10-29 Tencent Technology (Shenzhen) Company Limited Method and apparatus for driving vehicle in virtual environment, terminal, and storage medium
CN112604305B (zh) * 2020-12-17 2022-11-18 腾讯科技(深圳)有限公司 Virtual object control method, apparatus, terminal, and storage medium
CN112657192B (zh) * 2020-12-25 2023-05-09 珠海西山居数字科技有限公司 Collision detection method and apparatus
US12064688B2 (en) * 2020-12-30 2024-08-20 Activision Publishing, Inc. Methods and systems for determining decal projections intersecting spatial units in a frame of a game space
CN113274729B (zh) * 2021-06-24 2023-08-22 腾讯科技(深圳)有限公司 Interactive observation method, apparatus, device, and medium based on virtual scene
CN113730908B (zh) * 2021-09-15 2023-08-25 腾讯科技(深圳)有限公司 Picture display method and apparatus, storage medium, and electronic device
CN114529646A (zh) * 2022-02-10 2022-05-24 北京沃东天骏信息技术有限公司 Method, apparatus, device, and medium for controlling online blind box
CN115155061A (zh) * 2022-06-23 2022-10-11 联想(北京)有限公司 Information processing method, information processing apparatus, and electronic device

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102414641A (zh) * 2009-05-01 2012-04-11 微软公司 Altering a view perspective within a display environment
CN104436657A (zh) * 2014-12-22 2015-03-25 青岛烈焰畅游网络技术有限公司 Game control method and apparatus, and electronic device
CN104765905A (zh) * 2015-02-13 2015-07-08 上海同筑信息科技有限公司 BIM-based method and system for synchronous split-screen display of a plan view and a first-person view
EP2921938A1 (en) * 2014-03-18 2015-09-23 DreamWorks Animation LLC Interactive multi-rider virtual reality ride system
CN106774907A (zh) * 2016-12-22 2017-05-31 腾讯科技(深圳)有限公司 Method for adjusting visible area of virtual object in virtual scene, and mobile terminal
CN106959812A (zh) * 2016-01-11 2017-07-18 北京英雄互娱科技股份有限公司 Method and apparatus for human-computer interaction
CN107519641A (zh) * 2017-08-04 2017-12-29 网易(杭州)网络有限公司 Method and apparatus for controlling game skill release, storage medium, and mobile terminal
CN107803024A (zh) * 2017-09-28 2018-03-16 网易(杭州)网络有限公司 Shooting control method and apparatus
CN108499105A (zh) * 2018-04-16 2018-09-07 腾讯科技(深圳)有限公司 Method and apparatus for adjusting viewing angle in virtual environment, and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL120867A0 (en) * 1997-05-20 1997-09-30 Cadent Ltd Computer user interface for orthodontic use
US8619080B2 (en) * 2008-09-08 2013-12-31 Disney Enterprises, Inc. Physically present game camera
US10030979B2 (en) * 2016-07-29 2018-07-24 Matterport, Inc. Determining and/or generating a navigation path through a captured three-dimensional model rendered on a device
US10147237B2 (en) * 2016-09-21 2018-12-04 Verizon Patent And Licensing Inc. Foreground identification for virtual objects in an augmented reality environment
JP6539253B2 (ja) * 2016-12-06 2019-07-03 キヤノン株式会社 Information processing apparatus, control method therefor, and program
CN107890664A (zh) * 2017-10-23 2018-04-10 网易(杭州)网络有限公司 Information processing method and apparatus, storage medium, and electronic device


Also Published As

Publication number Publication date
CN108499105B (zh) 2022-02-25
CN108499105A (zh) 2018-09-07
US20200357163A1 (en) 2020-11-12
US11151773B2 (en) 2021-10-19

Similar Documents

Publication Publication Date Title
WO2019201047A1 (zh) Method and apparatus for adjusting viewing angle in virtual environment, and readable storage medium
US11703993B2 (en) Method, apparatus and device for view switching of virtual environment, and storage medium
US11224810B2 (en) Method and terminal for displaying distance information in virtual scene
CN109529319B (zh) 界面控件的显示方法、设备及存储介质
WO2019153836A1 (zh) 虚拟环境中虚拟对象的姿态确定方法、装置及介质
WO2019153824A1 (zh) 虚拟对象控制方法、装置、计算机设备及存储介质
AU2020256776B2 (en) Method and device for observing virtual article in virtual environment, and readable storage medium
US20230033874A1 (en) Virtual object control method and apparatus, terminal, and storage medium
WO2019205881A1 (zh) 虚拟环境中的信息显示方法、装置、设备及存储介质
KR102693824B1 (ko) 가상 환경 관찰 방법, 기기 및 저장 매체
US11845007B2 (en) Perspective rotation method and apparatus, device, and storage medium
WO2019109778A1 (zh) 游戏对局结果的展示方法、装置及终端
CN110465073A (zh) Method, apparatus, and device for adjusting viewing angle in virtual environment, and readable storage medium
US11675488B2 (en) Method and apparatus for constructing building in virtual environment, device, and storage medium
CN111273780B (zh) 基于虚拟环境的动画播放方法、装置、设备及存储介质
JP2023528119A (ja) 仮想オブジェクトの制御方法、装置、機器及びコンピュータプログラム
US20220291791A1 (en) Method and apparatus for determining selected target, device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19788713

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19788713

Country of ref document: EP

Kind code of ref document: A1