
WO2024077897A1 - Display control method and apparatus for a virtual scene, storage medium and electronic device - Google Patents

Display control method and apparatus for a virtual scene, storage medium and electronic device

Info

Publication number
WO2024077897A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
virtual scene
user interface
graphical user
virtual camera
Prior art date
Application number
PCT/CN2023/086883
Other languages
English (en)
Chinese (zh)
Inventor
刘震岳
Original Assignee
网易(杭州)网络有限公司
Priority date
Filing date
Publication date
Application filed by 网易(杭州)网络有限公司
Publication of WO2024077897A1


Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 - Changing parameters of virtual cameras
    • A63F13/5258 - Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/53 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5378 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 - Details of the user interface

Definitions

  • the present disclosure relates to the field of virtual interaction technology, and in particular to a display control method, device, storage medium and electronic device for a virtual scene.
  • a method for controlling display of a virtual scene wherein a graphical user interface is provided through a terminal device, and content displayed by the graphical user interface includes a portion of the virtual scene, and the method comprises:
  • in response to a first sliding operation on the graphical user interface, controlling the virtual camera to move in the virtual scene, and synchronously displaying, in the graphical user interface, a first virtual scene picture obtained by the virtual camera shooting according to the movement;
  • in response to the movement of the virtual camera causing the target object in the virtual scene to be displayed in a preset range area of the graphical user interface, controlling the virtual camera to move to a target position, and obtaining a second virtual scene picture at least partially including the target object by the virtual camera staying at the target position and photographing the target object;
  • continuously displaying the second virtual scene picture in the graphical user interface.
  • a display control device for a virtual scene characterized in that a graphical user interface is provided through a terminal device, and the content displayed by the graphical user interface includes a part of the virtual scene, and the device includes:
  • a first response module used to respond to a first sliding operation on the graphical user interface, control the virtual camera to move in the virtual scene, and synchronously display a first virtual scene image obtained by shooting according to the movement of the virtual camera in the graphical user interface;
  • a second response module is used to respond to the movement of the virtual camera so that the target object in the virtual scene is displayed in a preset range area of the graphical user interface, control the virtual camera to move to the target position and obtain a second virtual scene picture at least partially including the target object obtained by the virtual camera staying at the target position and photographing the target object;
  • the display module is used to continuously display the second virtual scene image in the graphical user interface.
  • a computer-readable storage medium on which a computer program is stored, characterized in that when the program is executed by a processor, a display control method for a virtual scene as in the first aspect of the above-mentioned embodiment is implemented.
  • an electronic device comprising: one or more processors; a storage device for storing one or more programs, which, when the one or more programs are executed by the one or more processors, enables the one or more processors to implement a display control method for a virtual scene as in the first aspect of the above-mentioned embodiment.
  • a computer program product or a computer program including computer instructions, the computer instructions being stored in a computer-readable storage medium.
  • a processor of a computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the methods provided in the above-mentioned various optional implementations.
  • the display control method of the virtual scene controls the virtual camera to move in the virtual scene in response to a first sliding operation acting on the graphical user interface, and synchronously displays, in the graphical user interface, a first virtual scene picture obtained by shooting according to the movement of the virtual camera. In response to the movement of the virtual camera causing the target object in the virtual scene to be displayed in a preset range area of the graphical user interface, the method controls the virtual camera to move to a target position, obtains a second virtual scene picture at least partially including the target object by having the virtual camera stay at the target position and shoot the target object, and continuously displays the second virtual scene picture in the graphical user interface. This ensures that, while the target object is in the preset range area, the second virtual scene picture displayed in the graphical user interface remains fixed, so that the player can view the information of the target object more stably without frequently switching between the large map and the small map, thereby solving the current technical problem of poor convenience in exploring the target object and viewing object information in the virtual scene.
  • FIG1 schematically shows a schematic diagram of an exemplary terminal device to which a display control method for a virtual scene and a display control device for a virtual scene according to one embodiment of the present disclosure can be applied;
  • FIG2 schematically shows a flow chart of a method for controlling display of a virtual scene according to one embodiment of the present disclosure
  • FIG3 schematically shows a graphical user interface diagram in a method for controlling display of a virtual scene according to one embodiment of the present disclosure
  • FIG4 schematically shows a flow chart of a method for controlling display of a virtual scene according to one embodiment of the present disclosure
  • FIG5 schematically shows a graphical user interface diagram in a method for controlling display of a virtual scene according to one embodiment of the present disclosure
  • FIG6 schematically shows a flow chart of a method for controlling display of a virtual scene according to one embodiment of the present disclosure
  • FIG7 schematically shows a structural block diagram of a display control device for a virtual scene according to one embodiment of the present disclosure
  • FIG8 schematically shows a schematic diagram of the structure of a computer system of a terminal device suitable for implementing one of the embodiments of the present disclosure.
  • the second is to update the virtual scene screen in a large range by sliding the virtual controls on the screen (such as a small map or other preset controls), but this method has a high cognitive cost for players, requiring players to be familiar with the distribution of each object in the small map, and requiring players to have a clear judgment and thinking about the target object to be searched before operation.
  • This process demands precision and is not instantaneous, whereas in practice the player's exploration of many virtual scenes is ambiguous. At the same time, players think about and judge information in the virtual scene in real time according to the sliding operation, so this process is real-time. Consequently, when exploring the virtual scene, performing a large sliding operation updates the virtual scene screen on a large scale, which makes it easy to miss the target object being searched for.
  • FIG1 shows a schematic diagram of an exemplary terminal device to which a display control method for a virtual scene and a display control device for a virtual scene according to the embodiments of the present disclosure can be applied.
  • the terminal device may include one or more of terminal devices 101, 102, and 103.
  • Terminal devices 101, 102, and 103 may be various electronic devices with display screens, including but not limited to desktop computers, portable computers, smart phones, and tablet computers.
  • the display control method of the virtual scene in one embodiment of the present disclosure can be run on a local terminal device or a server.
  • the method can be implemented and executed based on a cloud interaction system, wherein the cloud interaction system includes a server and a client device.
  • cloud games can be run under the cloud interaction system.
  • cloud games refer to a game mode based on cloud computing.
  • in the operation mode of cloud games, the body running the game program and the body presenting the game screen are separated.
  • the storage and operation of the display control method of the virtual scene are completed on the cloud game server.
  • the client device is used for receiving and sending data and presenting the game screen.
  • the client device can be a display device with a data transmission function close to the user side, such as a mobile terminal, a TV, a computer, or a handheld computer, while the information processing is performed by the cloud game server in the cloud.
  • when playing the game, the player operates the client device to send an operation instruction to the cloud game server.
  • the cloud game server runs the game according to the operation instruction, encodes and compresses the game screen and other data, and returns it to the client device through the network. Finally, the client device decodes and outputs the game screen.
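  • as an illustrative, non-limiting sketch that is not part of the original disclosure, the cloud-interaction round trip described above might be outlined as follows; the class and function names are hypothetical, and JSON stands in for the actual encoding and network transport:

```python
import json

def encode_frame(frame: dict) -> bytes:
    # Stand-in for encoding/compressing the game screen on the cloud game server.
    return json.dumps(frame).encode()

def decode_frame(payload: bytes) -> dict:
    # Stand-in for decoding the game screen on the client device.
    return json.loads(payload.decode())

class CloudGameServer:
    def handle(self, operation_instruction: dict) -> bytes:
        # Run the game according to the operation instruction and return the encoded screen.
        frame = {"scene": "virtual scene picture", "input": operation_instruction}
        return encode_frame(frame)

class ClientDevice:
    def play(self, server: CloudGameServer, operation_instruction: dict) -> None:
        payload = server.handle(operation_instruction)  # in practice this is a network round trip
        print("display:", decode_frame(payload))        # client decodes and outputs the game screen

ClientDevice().play(CloudGameServer(), {"type": "slide", "delta": [3, 1]})
```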
  • a local terminal device stores a game program and is used to present a game screen.
  • the local terminal device is used to interact with the player through a graphical user interface, that is, the game program is downloaded and installed and run through the conventional terminal device.
  • the local terminal device may provide the graphical user interface to the player in a variety of ways, for example, it may be rendered and displayed on the display screen of the terminal, or provided to the player through a holographic projection.
  • the local terminal device may include a display screen and a processor, wherein the display screen is used to present a graphical user interface, the graphical user interface includes a game screen, and the processor is used to run the game, generate the graphical user interface, and control the graphical user interface to be displayed on the display screen.
  • the computer-readable storage medium shown in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium or any combination of the above two.
  • a computer-readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, which carries a computer-readable program code.
  • the computer-readable storage medium may be included in the terminal device described in the above embodiments; or it may exist alone without being assembled into the terminal device.
  • the above-mentioned computer-readable storage medium carries one or more programs, and when the above-mentioned one or more programs are executed by a terminal device, the terminal device implements the method in the following embodiments. For example, the terminal device can implement the various steps shown in Figure 2, etc.
  • the virtual scene can be a digital scene outlined by a computer, mobile phone, tablet computer or other intelligent terminal device through digital communication technology.
  • the digital scene can be on the display screen of the intelligent terminal device or projected onto other display devices.
  • the virtual scene can include buildings or structures such as houses, buildings, gardens, bridges, pools, etc., and can also include natural landscapes such as mountains, rivers, lakes, and any virtual items such as weapons, tools, and creatures. This exemplary embodiment does not specifically limit this.
  • the virtual character is a three-dimensional stereo model created based on animation skeleton technology. Each virtual character has its own shape and volume in the three-dimensional virtual environment and occupies a part of the space in the three-dimensional virtual environment.
  • the virtual scene screen presented by the graphical user interface can be, for example, a game screen.
  • the virtual camera set in the virtual scene takes the picture.
  • the virtual camera can obtain the virtual scene picture based on the first perspective, the second perspective or other perspectives, for example, it can also be a bird's-eye view, which is a perspective of observing the virtual environment from an aerial perspective.
  • a bird's-eye view is used, the virtual camera can be located above the virtual scene.
  • when the height of the virtual camera is relatively low, the picture obtained may be only a part of the virtual scene, for example in a multiplayer online battle arena (MOBA) game; when the height of the virtual camera is high, the picture obtained can be the whole picture of the virtual scene.
  • the virtual scene is a simulation environment of the real world, or a semi-simulation and semi-fictitious virtual environment, or a purely fictitious virtual environment.
  • the virtual scene is any one of a two-dimensional virtual scene and a three-dimensional virtual scene.
  • the virtual environment can be the sky, land, ocean, etc., wherein the land includes environmental elements such as deserts and cities.
  • the virtual scene is a scene with complete game logic in which virtual objects are controlled by users. For example, for sandbox 3D shooting games, the virtual scene is a 3D game world in which players control virtual objects to fight.
  • Example virtual scenes may include: at least one element of mountains, plains, rivers, lakes, oceans, deserts, skies, plants, buildings, and vehicles; for example, for 2D card games, the virtual scene is a scene for displaying released cards or displaying virtual objects corresponding to cards.
  • Example virtual scenes may include: an arena, a decisive battlefield, or other "field" elements or other elements that can display the status of card battles; for 2D or 3D multiplayer online tactical competitive games, the virtual scene is a 2D or 3D terrain scene for virtual objects to fight.
  • Example virtual scenes may include: canyon-style mountains, lanes, rivers, classrooms, tables and chairs, podiums and other elements.
  • the virtual camera is an image acquisition device in the game world, which is used to collect environmental information within the field of view of the subject object or in the virtual scene.
  • as the position of the virtual camera changes, the game screen collected in the virtual scene also changes, and the collected game screen is displayed on the graphical user interface.
  • the game screen displayed in the graphical user interface is the environment where the subject object is in the virtual scene, that is, the "big map" in the embodiment of the present disclosure.
  • This example embodiment provides a method for controlling the display of a virtual scene.
  • a graphical user interface is provided by a terminal device.
  • the content displayed by the graphical user interface includes a portion of the virtual scene.
  • the method for controlling the display of the virtual scene includes the following steps 201 to 203:
  • Step 201 In response to a first sliding operation on a graphical user interface, the virtual camera is controlled to move in the virtual scene, and a first virtual scene picture obtained by the virtual camera shooting during its movement is synchronously displayed in the graphical user interface.
  • the first sliding operation is an operation for controlling the movement of the virtual camera in the virtual scene.
  • the first sliding operation can be a sliding operation of continuously sliding from any first position of the graphical user interface to any second position, or a sliding operation of continuously sliding from a first position in a preset response area to any second position, or a combination of other specified operations and sliding operations, such as long press operation + sliding operation, double-click operation + sliding operation, etc.
  • the present disclosure embodiment does not make specific limitations and can be specifically set according to actual conditions.
  • the graphical user interface displays part of the game scene in the virtual scene, and the displayed scene content is continuously adjusted with the player's first sliding operation.
  • the player controls the virtual camera to move in the virtual scene through the first sliding operation in the graphical user interface.
  • the screen of a fixed area size displayed in the graphical user interface is the first virtual scene screen.
  • Step 202 In response to the movement of the virtual camera causing the target object in the virtual scene to be displayed in a preset range area of the graphical user interface, the virtual camera is controlled to move to the target position, and a second virtual scene picture at least partially including the target object is obtained by the virtual camera staying at the target position and photographing the target object.
  • the player continuously moves the position of the virtual camera through the first sliding operation, and the game screen in the field of view of the virtual camera is constantly changing until the target object determined by the player is displayed in the preset range area of the graphical user interface.
  • the target object refers to one or more virtual objects that the player selects from the virtual objects in the virtual scene according to actual needs and wants to view information.
  • the example of the virtual object has been explained in detail in the above application environment and will not be repeated here.
  • the preset range area refers to a display hotspot set by the player in the graphical user interface.
  • the size and position of the display hotspot can be specifically set according to actual conditions, and the embodiment of the present disclosure is not specifically limited.
  • the target object in the embodiment of the present disclosure is displayed in the preset range area of the graphical user interface, which means that the target object and the preset range area have an intersection, that is, the target object is completely in the preset range area, or the target object is partially in the preset range area, and the embodiment of the present disclosure is not limited in any way.
  • the virtual camera moves to the target position, which refers to the position where the virtual camera can capture the target object. It can be the current position of the virtual camera, or it can be the position of the virtual camera that is corrected based on the current position so that the target object is at the center point of the preset range area or the center point of the graphical user interface.
  • the virtual camera stays at the target position to capture images of the target object and the virtual scene, and a second virtual scene image containing the target object can be obtained.
  • Step 203 Continuously display the second virtual scene image in the graphical user interface.
  • the second virtual scene picture containing the target object, captured by the virtual camera, continues to be displayed in the graphical user interface. That is, as long as the target object is within the preset range area, no matter how the player operates, the target object remains locked and displayed in the graphical user interface.
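  • a minimal sketch of steps 201 to 203 under simplifying assumptions (a top-down camera described by a 2D position and a hypothetical world_to_screen mapping); all names and values are illustrative and not taken from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    x: float
    y: float
    locked: bool = False   # True once the target object is locked for display

def world_to_screen(camera, world_pos):
    # Hypothetical mapping from virtual-scene coordinates to screen coordinates,
    # assuming a 1920x1080 screen centred on the camera.
    return world_pos[0] - camera.x + 960, world_pos[1] - camera.y + 540

def in_preset_range(screen_pos, preset_area):
    (x, y), (left, top, right, bottom) = screen_pos, preset_area
    return left <= x <= right and top <= y <= bottom

def on_first_slide(camera, slide_delta, target_world_pos, preset_area):
    if not camera.locked:
        # Step 201: move the camera with the slide; the first virtual scene picture follows it.
        camera.x += slide_delta[0]
        camera.y += slide_delta[1]
        if in_preset_range(world_to_screen(camera, target_world_pos), preset_area):
            # Step 202: correct the camera to the target position and keep it there.
            camera.x, camera.y = target_world_pos
            camera.locked = True
    # Step 203: while locked, the second virtual scene picture (containing the target object)
    # stays displayed no matter how the first sliding operation continues.
    return camera.x, camera.y

cam = VirtualCamera(0.0, 0.0)
print(on_first_slide(cam, (5.0, 2.0), (6.0, 3.0), (760, 340, 1160, 740)))  # (6.0, 3.0): locked
```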
  • the display control method of the virtual scene controls the virtual camera to move in the virtual scene in response to the first sliding operation acting on the graphical user interface, and synchronously displays, in the graphical user interface, the first virtual scene picture obtained by shooting according to the movement of the virtual camera. In response to the movement of the virtual camera causing the target object in the virtual scene to be displayed in the preset range area of the graphical user interface, the method controls the virtual camera to move to the target position, obtains the second virtual scene picture at least partially including the target object by having the virtual camera stay at the target position and shoot the target object, and continuously displays the second virtual scene picture in the graphical user interface. This ensures that, while the target object is in the preset range area, the second virtual scene picture displayed in the graphical user interface remains unchanged; the player's operation is simple, and there is no need to frequently switch between the large map and the small map. At the same time, it avoids the situation in the traditional approach where, when exploring the virtual scene, a large-scale sliding operation causes a large-scale update and adjustment of the virtual scene picture and makes it easy to miss the target object being searched for.
  • the display control method of the virtual scene further includes the following step A:
  • Step A In response to a second sliding operation for controlling the virtual camera to resume movement in the virtual scene, a third virtual scene image obtained by shooting according to the movement of the virtual camera is synchronously displayed in the graphical user interface.
  • the second sliding operation is a continuous operation with the first sliding operation.
  • the second sliding operation is an operation of continuing the sliding operation based on the first sliding operation, and the virtual object in the virtual scene corresponding to the end position of the second sliding operation is outside the preset range area, that is, the third virtual scene screen does not contain the target object.
  • a third virtual scene picture excluding the target object obtained by the movement of the virtual camera is synchronously displayed in the graphical user interface, that is, the lock between the virtual camera and the target object is released, so that the virtual camera continues to move with the second sliding operation, and the collected virtual scene content also keeps changing.
  • the second virtual scene where the target object is located is no longer fixedly displayed in the graphical user interface, but the scene image excluding the target object and actually collected by the virtual camera is displayed, that is, the third virtual scene picture is synchronously displayed.
  • the second virtual scene picture containing the target object can be locked and displayed in the graphical user interface to make it easy for the player to view information, with a good viewing effect and high viewing stability; when the virtual camera leaves the target position at which the target object can be captured, the locking relationship between the virtual camera and the target object is released, so that the virtual camera moves with the player's second sliding operation, thereby achieving rapid movement in the virtual scene and more efficient exploration of the target object and viewing of the object information.
  • the virtual camera has a projection position mapped to the ground plane of the virtual scene, and the projection position moves as the virtual camera moves in the virtual scene.
  • the projection position can be the intersection of the extended optical axis of the virtual camera and the ground plane of the virtual scene, or the intersection of other virtual rays emitted along any fixed direction with the virtual camera as the source point and the ground of the virtual scene.
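  • a hedged sketch of one projection variant described above, taking the projection position as the intersection of the camera's optical axis with the ground plane z = 0; the axis representation and plane are assumptions for illustration:

```python
def projection_position(camera_pos, optical_axis):
    # Intersect the ray camera_pos + t * optical_axis with the ground plane z = 0.
    cx, cy, cz = camera_pos
    dx, dy, dz = optical_axis
    if dz == 0:
        return None                       # axis parallel to the ground plane: no intersection
    t = -cz / dz
    return (cx + t * dx, cy + t * dy)     # point on the ground plane of the virtual scene

# A bird's-eye camera at height 10 looking straight down projects directly beneath itself.
print(projection_position((5.0, 5.0, 10.0), (0.0, 0.0, -1.0)))  # (5.0, 5.0)
```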
  • the display control method of the virtual scene also includes the following step B:
  • Step B referring to FIG. 3 , when the projection position of the virtual camera enters the target area 302 in the virtual scene through the first sliding operation, the target object 301 is displayed in the preset range area of the graphical user interface.
  • the target area 302 is the snap (adsorption) determination area of the target object 301.
  • in this way, the target object 301 can be determined, and it can be determined that the target object 301 is displayed in the preset range area of the graphical user interface.
  • the target area 302 is a scene area determined according to the position of the target object 301; for example, it can be a square area of L*L with the target object 301 as its center point (4 ≤ L ≤ 10, and the value is configurable), or a circle with a radius of L, or another shape, which is not specifically limited in the embodiments of the present disclosure.
  • the disclosed embodiment sets a target area 302.
  • when the projection position of the virtual camera enters the target area 302 in the virtual scene through a first sliding operation, it can be determined that the target object 301 has entered the preset range area, and the target object 301 is displayed in the preset range area of the graphical user interface; this reduces the accuracy requirements on the player's operation and thus improves the convenience of the player's operation.
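  • a minimal sketch of the snap judgment described above, assuming the target area 302 is a square of side L centred on the target object 301 (the disclosure also allows circles or other shapes; the value of L below is illustrative):

```python
def in_target_area(projection_pos, target_pos, side_length=6.0):  # 4 <= L <= 10, configurable
    px, py = projection_pos
    tx, ty = target_pos
    half = side_length / 2
    return abs(px - tx) <= half and abs(py - ty) <= half

# Once the projection position enters this area, the target object 301 is treated as being
# displayed in the preset range area and the camera is corrected to the target position.
print(in_target_area((4.0, 4.0), (5.0, 5.0)))  # True: inside the 6 x 6 square
```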
  • controlling the virtual camera to move to the target position in the above step 202 includes:
  • Step C canceling the first control association between the first sliding operation and controlling the movement of the virtual camera, determining the target position according to the position of the target object in the virtual scene, and controlling the virtual camera to move to the target position.
  • the first control association is used to control the virtual camera to move according to the first sliding operation. That is to say, when the target object is outside the preset range area, the first sliding operation is associated with the virtual camera, and the virtual camera moves with the movement of the first sliding operation; when the target object is in the preset range area, the virtual camera is disassociated from the first sliding operation and locks onto the target object for shooting.
  • the virtual camera is always at the target position and continues to shoot the target object at this position, regardless of the first sliding operation.
  • the determination of the target position can refer to the above step 202, which will not be repeated here.
  • the disclosed embodiment releases the first control association between the first sliding operation and the control of the movement of the virtual camera once the target object is displayed in the preset range area, determines the target position according to the position of the target object in the virtual scene, and controls the virtual camera to move to the target position. This achieves stable shooting of the target object at the target position and provides stable, continuous information about the target object that is convenient for players to view. There is no need to frequently switch between the large map and the small map; viewing the information of the target object is more convenient, and the information display is more stable and durable.
  • the method further includes the following step D:
  • Step D The first sliding operation establishes a second control association with the operation positioning position in the virtual scene.
  • the operation positioning position refers to any position point in the target area in the virtual scene.
  • the player continues to move through the first sliding operation, and the movement is mapped to the virtual scene, which corresponds to the operation positioning position. That is, after the terminal device releases the first control association between the first sliding operation and the control of the movement of the virtual camera, it first constructs the second control association between the first sliding operation and the operation positioning position in the virtual scene, and determines the mapping relationship of the first sliding operation in the target area through the second control association; the second control association is used to determine the operation positioning position corresponding to the first sliding operation.
  • step A in response to the second sliding operation for controlling the virtual camera to resume movement in the virtual scene, synchronously displaying the third virtual scene image obtained by shooting according to the movement of the virtual camera in the graphical user interface, includes the following steps 401-402:
  • Step 401 In response to a first sliding operation causing the operation positioning position to leave a target area in a virtual scene, a third control association between a second sliding operation and controlling the movement of a virtual camera is established.
  • the third control association is used to control the movement of the virtual camera according to the second sliding operation.
  • the terminal device establishes the third control association between the second sliding operation and the control of the movement of the virtual camera, and the association between the player's second sliding operation and the movement of the virtual camera or the second sliding operation and the position of the virtual camera is constructed through the third control association.
  • Step 402 synchronously displaying a third virtual scene image obtained by moving the virtual camera in the graphical user interface.
  • the terminal device can synchronously display the third virtual scene image obtained by moving the virtual camera in the graphical user interface in the same manner as in step A above, which will not be described in detail here.
  • the disclosed embodiment establishes the second control association between the first sliding operation and the operation positioning position in the virtual scene after releasing the first control association between the first sliding operation and the control of the movement of the virtual camera, and establishes the third control association between the second sliding operation and the control of the movement of the virtual camera in response to the first sliding operation causing the operation positioning position to leave the target area in the virtual scene. Different associations are thus constructed for the cases in which the projection position of the virtual camera is inside the target area in the virtual scene and in which it has left the target area, so that targeted display control can be performed and the accuracy of operation control is improved.
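  • the three control associations described above can be read as a small state machine; the following sketch is illustrative only, and the state names are not taken from the disclosure:

```python
FIRST = "first association: slide -> camera movement"
SECOND = "second association: slide -> operation positioning position (camera locked)"
THIRD = "third association: slide -> camera movement resumed"

def next_association(state, projection_in_target_area, positioning_in_target_area):
    if state == FIRST and projection_in_target_area:
        return SECOND   # target shown in preset range area: cancel the first, build the second
    if state == SECOND and not positioning_in_target_area:
        return THIRD    # operation positioning position left the target area (steps 401-402)
    return state        # otherwise keep the current association

state = FIRST
state = next_association(state, projection_in_target_area=True, positioning_in_target_area=True)
state = next_association(state, projection_in_target_area=True, positioning_in_target_area=False)
print(state)  # third association: slide -> camera movement resumed
```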
  • step 203 continuously displaying the second virtual scene image in the graphical user interface, includes the following step E:
  • Step E responding to the first sliding operation so that the operation positioning position does not leave the target area in the virtual scene, and continuously displaying the second virtual scene image in the graphical user interface.
  • the first sliding operation ensures that the operation positioning position does not leave the target area in the virtual scene, that is, the first sliding operation controls the movement of the virtual camera, and the projection position of the virtual camera in the virtual scene is still within the target area, that is, the current target object is still within the shooting range of the virtual camera. Therefore, the terminal device can continue to display the second virtual scene screen in the graphical user interface, and the screen display stability is higher, and the player's exploration of the target object in the virtual scene and viewing of object information are more effective.
  • the embodiment of the present disclosure continuously displays the second virtual scene picture in the graphical user interface while the operation positioning position does not leave the target area in the virtual scene, which ensures that the corresponding second control association is used to control the mapping of the virtual camera in the virtual scene, thereby improving the reliability of the virtual scene display and making it convenient for players to view the information of the target object.
  • the graphical user interface provides a game thumbnail of the virtual scene
  • the response area of the game thumbnail is the first area 501
  • the area other than the first area 501 in the graphical user interface 10 is the second area 502.
  • the first area 501 and the second area 502 are independent of each other, and the relative size and position of the two can be specifically set according to actual conditions, and are not specifically limited in the embodiment of the present disclosure.
  • the first sliding operation acting on the graphical user interface in the above step 201 includes: a sliding operation starting from the first area 501 and passing through the second area 502 .
  • in this embodiment, the first sliding operation is configured to start sliding from the game thumbnail, that is, from the first area 501 of the small map, and the sliding area is the entire graphical user interface 10, that is, it also covers the second area 502.
  • the player only needs to start from the game thumbnail, and then move from the first position in the game thumbnail to the second position according to a certain mapping relationship.
  • the operation is simple and fast, and there is no need to frequently switch between the large map and the small map, and the player's operation efficiency is higher.
  • the sliding operation area is larger, and the player's operation is more convenient.
  • the above step 201 in response to the first sliding operation on the graphical user interface, controls the virtual camera to move in the virtual scene, and synchronously displays the first virtual scene image obtained by shooting according to the movement of the virtual camera in the graphical user interface, includes the following step F:
  • Step F in response to a first sliding operation on the graphical user interface, controlling the virtual camera to move at a first rate in the virtual scene, and synchronously displaying in the graphical user interface a first virtual scene image captured when the virtual camera moves at the first rate.
  • Step G in response to the third sliding operation in the second area 502, control the virtual camera to move at a second rate in the virtual scene, and synchronously display in the graphical user interface a fourth virtual scene image captured when the virtual camera moves at the second rate.
  • the first rate is greater than the second rate. That is, when the first sliding operation starts from the first area 501 of the game thumbnail and slides within the first area 501, or the first sliding operation starts from the first area 501 of the game thumbnail and slides to the second area 502 outside the game thumbnail, the virtual camera moves at a larger first rate to collect more scene information; when the third sliding operation starts from the second area 502 outside the game thumbnail, the virtual camera moves and shoots at a smaller second rate to provide corresponding scene information for the large map. In this way, corresponding scene information can be provided for different map models with higher reliability.
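  • a hedged sketch of this rate selection; the rectangle standing for the first area 501 and the two rate values are illustrative assumptions, not figures from the disclosure:

```python
FIRST_AREA = (0, 0, 200, 200)        # hypothetical thumbnail rectangle: left, top, right, bottom
FIRST_RATE, SECOND_RATE = 8.0, 2.0   # first rate > second rate, as described above

def camera_move_rate(slide_start):
    x, y = slide_start
    left, top, right, bottom = FIRST_AREA
    starts_in_first_area = left <= x <= right and top <= y <= bottom
    return FIRST_RATE if starts_in_first_area else SECOND_RATE

print(camera_move_rate((50, 50)))    # 8.0: the slide starts on the game thumbnail (first area 501)
print(camera_move_rate((800, 400)))  # 2.0: the slide starts on the big map (second area 502)
```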
  • the target object includes: a target virtual model and a target scene area adjacent to the target virtual model, and the second virtual scene picture at least includes the target virtual model.
  • a virtual coordinate system is generally constructed, and each virtual model is located at a corresponding position in the coordinate system.
  • the target virtual model in the embodiment of the present disclosure refers to an object with a certain characteristic form in the virtual scene, such as mountains, rocks, lakes, virtual characters, etc.;
  • the target scene area refers to the scene position of the target virtual model in the virtual scene, that is, the specific position of the target virtual model in the virtual coordinate system, which can be represented by the position of the plot or the coordinates of the plot, for example.
  • the preset range area includes a full-screen area, an edge area, or a screen area generated around the center point of the graphical user interface.
  • the embodiment of the present disclosure can set the specific position of the preset range area according to actual conditions, such as the full-screen area, the edge area, or a screen area generated around the center point of the graphical user interface, which makes the setting more flexible.
  • the target position is the position of the virtual camera when the virtual camera is translated to a position where the camera shooting angle is aimed at the target object, thereby ensuring that the target object can be completely photographed by the virtual camera.
  • the virtual camera has a projection position mapped to the ground plane of the virtual scene
  • the target position is the position of the virtual camera when the virtual camera is translated so that the projection position coincides with the position of the target object in the virtual scene.
  • when the projection position is the intersection of the optical axis of the virtual camera with the ground plane of the virtual scene,
  • the target position is the position of the virtual camera when the optical axis of the virtual camera intersects the target object, thereby ensuring that the target object is completely in the center of the shooting field of view of the virtual camera; this ensures the integrity and reliability of the target object information, makes it easier for players to view information, and makes information viewing more convenient.
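  • a minimal sketch of the translation described above: the camera is shifted by the same ground-plane offset that moves its projection position onto the target object, keeping the camera height unchanged (all names and values are illustrative):

```python
def target_camera_position(camera_pos, projection_pos, target_pos):
    cx, cy, cz = camera_pos
    px, py = projection_pos
    tx, ty = target_pos
    # Translate the camera so that its projection position coincides with the target object.
    return (cx + (tx - px), cy + (ty - py), cz)

# Projection at (3, 3), target object at (5, 6): the camera shifts by (+2, +3) at constant height.
print(target_camera_position((3.0, 3.0, 10.0), (3.0, 3.0), (5.0, 6.0)))  # (5.0, 6.0, 10.0)
```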
  • step 201 in response to the first sliding operation on the graphical user interface, controlling the virtual camera to move in the virtual scene, includes the following steps 601 to 605:
  • Step 601 In response to a first sliding operation on a graphical user interface, a first movement vector of the first sliding operation is obtained.
  • Step 602 Determine a first displacement vector based on the first movement vector and a first preset parameter.
  • Step 603 Control the virtual camera to move in the virtual scene according to the first displacement vector.
  • Step 604 Determine a second displacement vector based on the first movement vector and the second preset parameter.
  • Step 605 Update the scene content displayed in the thumbnail according to the second displacement vector.
  • the first movement vector refers to the displacement vector of the actual operation of the player in the graphical user interface
  • the first displacement vector refers to the corresponding displacement vector of the first sliding operation mapped to the virtual scene.
  • the first preset parameter is used to characterize the mapping relationship between the movement vector of the first sliding operation of the player in the touch screen of the terminal device or the graphical user interface and the moving distance in the thumbnail of the virtual scene.
  • the second preset parameter is used to characterize the mapping relationship between the movement vector of the first sliding operation of the player in the touch screen of the terminal device or the graphical user interface and the moving distance of the large map in the virtual scene.
  • the disclosed embodiment determines the first displacement vector of the corresponding mapping in the large map and the second displacement vector in the thumbnail based on the first movement vector of the first sliding operation in the graphical user interface, and then controls the virtual camera to move in the virtual scene according to the first displacement vector, and updates the scene content displayed in the thumbnail according to the second displacement vector, thereby realizing the synchronous movement of the player's first sliding operation in the graphical user interface and the virtual scene and the thumbnail, and then realizing the synchronization of display, thereby improving the player's gaming experience.
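  • a hedged sketch of steps 601 to 605, mapping one on-screen movement vector of the first sliding operation to a camera displacement in the virtual scene and a content displacement in the thumbnail; the two preset parameter values are illustrative assumptions:

```python
FIRST_PRESET = 4.0    # screen pixels -> virtual-scene units (big map), first preset parameter
SECOND_PRESET = 0.25  # screen pixels -> thumbnail units, second preset parameter

def apply_first_slide(camera_pos, thumbnail_offset, movement_vector):
    mx, my = movement_vector                                          # step 601: first movement vector
    first_displacement = (mx * FIRST_PRESET, my * FIRST_PRESET)       # step 602
    camera_pos = (camera_pos[0] + first_displacement[0],
                  camera_pos[1] + first_displacement[1])              # step 603: move the virtual camera
    second_displacement = (mx * SECOND_PRESET, my * SECOND_PRESET)    # step 604
    thumbnail_offset = (thumbnail_offset[0] + second_displacement[0],
                        thumbnail_offset[1] + second_displacement[1]) # step 605: update the thumbnail
    return camera_pos, thumbnail_offset

print(apply_first_slide((0.0, 0.0), (0.0, 0.0), (10, -4)))  # ((40.0, -16.0), (2.5, -1.0))
```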
  • the display control method of the virtual scene further includes the following step H:
  • Step H Determine the size of the first preset parameter and the second preset parameter according to the proportional relationship between the virtual scene and the thumbnail.
  • assuming the size of the thumbnail is m1 (length) * n1 (width) and the size of the virtual scene is m2 (length) * n2 (width), the ratio between the first preset parameter and the second preset parameter can be set according to the proportional relationship between the virtual scene and the thumbnail, for example m2/m1 or n2/n1;
  • the sizes of the first preset parameter and the second preset parameter may also be adaptively adjusted and modified according to the resolution parameters of the terminal, etc., which is not specifically limited in the embodiment of the present disclosure.
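  • a minimal sketch of step H, assuming the ratio of the two preset parameters follows the scale between the virtual scene and the thumbnail; the base per-pixel factor is an illustrative assumption:

```python
def preset_parameters(m1, n1, m2, n2, base=0.25):
    # base: screen pixels -> thumbnail units (the assumed second preset parameter)
    scale = ((m2 / m1) + (n2 / n1)) / 2   # scene-to-thumbnail scale, averaged over both axes
    return base * scale, base             # (first preset parameter, second preset parameter)

# Thumbnail 200 x 100, virtual scene 3200 x 1600: the big map moves 16 times as far per pixel.
print(preset_parameters(200, 100, 3200, 1600))  # (4.0, 0.25)
```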
  • FIG. 7 shows a schematic architecture diagram of the display control device 700 for a virtual scene, including: a first response module 710, a second response module 720 and a display module 750, wherein:
  • the first response module 710 is used to respond to the first sliding operation on the graphical user interface, control the virtual camera to move in the virtual scene, and synchronously display, in the graphical user interface, the first virtual scene picture obtained by shooting according to the movement of the virtual camera;
  • the second response module 720 is used to respond to the movement of the virtual camera so that the target object in the virtual scene is displayed in a preset range area of the graphical user interface, control the virtual camera to move to the target position and obtain a second virtual scene image at least partially including the target object obtained by the virtual camera staying at the target position to shoot the target object;
  • the display module 750 is used to continuously display the second virtual scene image in the graphical user interface.
  • the display control device 700 of the virtual scene proposed in the embodiment of the present disclosure controls the virtual camera to move in the virtual scene in response to the first sliding operation acting on the graphical user interface, and synchronously displays, in the graphical user interface, the first virtual scene picture obtained by shooting according to the movement of the virtual camera. In response to the movement of the virtual camera causing the target object in the virtual scene to be displayed in the preset range area of the graphical user interface, it controls the virtual camera to move to the target position, obtains the second virtual scene picture at least partially including the target object by having the virtual camera stay at the target position and shoot the target object, and continuously displays the second virtual scene picture in the graphical user interface. This ensures that, while the target object is in the preset range area, the second virtual scene picture displayed in the graphical user interface remains fixed, so that the player can view the information of the target object more stably without frequently switching between the large map and the small map, thereby solving the current technical problem of poor convenience in exploring the target object and viewing object information in the virtual scene.
  • the display control device 700 of the virtual scene also includes a third response module 730, which is specifically used to respond to a second sliding operation used to control the virtual camera to resume movement in the virtual scene, and synchronously display a third virtual scene picture obtained by shooting according to the movement of the virtual camera in the graphical user interface, wherein the second sliding operation and the first sliding operation are continuous operations.
  • the second virtual scene picture containing the target object can be locked and displayed in the graphical user interface to make it easy for the player to view information, with a good viewing effect and high viewing stability; when the virtual camera leaves the target position at which the target object can be captured, the locking relationship between the virtual camera and the target object is released, so that the virtual camera moves with the player's second sliding operation, thereby achieving rapid movement in the virtual scene and more efficient exploration of the target object and viewing of the object information. There is no need to frequently switch operations while exploring the target object in the virtual scene and viewing the object information, which greatly improves the convenience of player operation while ensuring the display effect of the large and small maps.
  • the virtual camera has a projection position mapped to the ground plane of the virtual scene, and the projection position moves as the virtual camera moves in the virtual scene; when the projection position of the virtual camera enters a target area in the virtual scene through a first sliding operation, the target object is displayed in a preset range area of the graphical user interface, wherein the target area is a scene area determined according to the position of the target object.
  • by setting a target area, the display control device of the above-mentioned virtual scene can, when the projection position of the virtual camera enters the target area in the virtual scene through the first sliding operation, determine that the target object is displayed in the preset range area of the graphical user interface, which reduces the accuracy requirements on the player's operation and thus improves the convenience of the player's operation.
  • the second response module 720 is specifically used to release the first control association between the first sliding operation and the control of the movement of the virtual camera, determine the target position according to the position of the target object in the virtual scene, and control the virtual camera to move to the target position, wherein the first control association is used to control the virtual camera to move according to the first sliding operation.
  • the first control association between the first sliding operation and the control of the movement of the virtual camera can be released once the target object is displayed in the preset range area.
  • the target position is determined according to the position of the target object in the virtual scene, and the virtual camera is controlled to move to the target position. This can achieve stable shooting of the target object at the target position, and obtain stable and continuous information of the target object, which is convenient for players to view. There is no need to frequently switch between the large map and the small map. The information viewing of the target object is more convenient, and the information display is more stable and lasting.
  • the second response module 720 is also used to establish a second control association between the first sliding operation and the operation positioning position in the virtual scene, and the second control association is used to determine the operation positioning position corresponding to the first sliding operation; correspondingly, the third response module 730 is specifically used to respond to the first sliding operation so that the operation positioning position leaves the target area in the virtual scene, and establish a third control association between the second sliding operation and controlling the movement of the virtual camera, wherein the third control association is used to control the virtual camera to move according to the second sliding operation; the display module 750 is also used to synchronously display a third virtual scene image obtained by shooting according to the movement of the virtual camera in the graphical user interface.
  • the embodiment of the present disclosure can establish the second control association between the first sliding operation and the operation positioning position in the virtual scene after releasing the first control association between the first sliding operation and the control of the movement of the virtual camera, and establish the third control association between the second sliding operation and the control of the movement of the virtual camera in response to the first sliding operation causing the operation positioning position to leave the target area in the virtual scene, thereby constructing different associations for the cases in which the projection position of the virtual camera is inside the target area and in which it has left the target area, so as to perform targeted display control and improve the accuracy of operation control.
  • the second response module 720 is specifically configured to respond to the first sliding operation so that the operation positioning position does not leave the target area in the virtual scene, and continuously display the second virtual scene image in the graphical user interface.
  • the disclosed embodiment uses the above-mentioned second response module to continuously display the second virtual scene picture in the graphical user interface in response to the first sliding operation keeping the operation positioning position within the target area in the virtual scene, so as to ensure that the corresponding second control association is used to control the mapping of the virtual camera in the virtual scene, thereby improving the reliability of the virtual scene display and making it convenient for players to view the information of the target object.
  • the graphical user interface provides a game thumbnail of the virtual scene, the response area of the game thumbnail is the first area, and the area other than the first area in the graphical user interface is the second area; the first sliding operation acting on the graphical user interface includes: a sliding operation starting from the first area and passing through the second area.
  • the game thumbnail described above makes the player's operation simple and quick, without the need to frequently switch between the large map and the small map, so the player's operation efficiency is higher. At the same time, the sliding operation area is larger, and the player's operation is more convenient.
  • the first response module 710 is specifically used to, in response to a first sliding operation on the graphical user interface, control the virtual camera to move at a first rate in the virtual scene, and synchronously display in the graphical user interface a first virtual scene image captured when the virtual camera moves at the first rate;
  • the display control device 700 of the virtual scene also includes a fourth response module 740, which is used to respond to the third sliding operation in the second area, control the virtual camera to move at a second rate in the virtual scene, and synchronously display in the graphical user interface a fourth virtual scene picture captured when the virtual camera moves at the second rate, wherein the first rate is greater than the second rate.
  • when the sliding operation takes place within the first area, or slides from the first area of the game thumbnail to the second area outside the game thumbnail, the virtual camera can move at the larger first rate to collect more scene information; when the sliding operation starts from the second area outside the game thumbnail, the virtual camera moves and shoots at the smaller second rate, thereby providing corresponding scene information for the large map and for different map models with higher reliability.
  • the target object includes: a target virtual model and a target scene area adjacent to the target virtual model, and the second virtual scene image at least includes the target virtual model.
  • the preset range area includes a full screen area, an edge area, or a screen area generated with the center point of the graphical user interface as its center.
  • the preset range area may be the full screen area, the edge area, or the screen area generated with the center point of the graphical user interface as its center, which provides higher flexibility.
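The three variants of the preset range area can be checked with a simple containment test; the sketch below is one possible reading, with the edge-band width and the half-size of the centered region chosen arbitrarily for illustration.

```python
def in_preset_range(obj_xy, screen_wh, mode="center", edge_margin=50, half_size=200):
    """Check whether a screen-space position falls in the preset range area."""
    x, y = obj_xy
    w, h = screen_wh
    if mode == "full":    # the whole screen
        return 0 <= x <= w and 0 <= y <= h
    if mode == "edge":    # a band of edge_margin pixels along the screen border
        on_screen = 0 <= x <= w and 0 <= y <= h
        near_edge = (x <= edge_margin or y <= edge_margin or
                     x >= w - edge_margin or y >= h - edge_margin)
        return on_screen and near_edge
    # "center": a square region generated around the center point of the interface
    cx, cy = w / 2, h / 2
    return abs(x - cx) <= half_size and abs(y - cy) <= half_size


print(in_preset_range((960, 540), (1920, 1080)))          # True, at the screen center
print(in_preset_range((30, 500), (1920, 1080), "edge"))   # True, near the left edge
```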
  • the target position is the position of the virtual camera when the virtual camera is translated to a position where the camera shooting angle is aimed at the target object.
  • the target position is the position of the virtual camera when the virtual camera is translated to a position where the camera shooting angle is aimed at the target object, thereby ensuring that the target object can be completely photographed by the virtual camera.
  • the virtual camera has a projection position mapped onto the ground plane of the virtual scene, and the target position is the position of the virtual camera when the virtual camera is translated so that its projection position coincides with the position of the target object in the virtual scene.
  • in this way, the target object is located at the center of the shooting field of view of the virtual camera, thereby ensuring the integrity and reliability of the target object information and making it convenient for players to view the information.
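One way to compute such a target position is to cast the camera's view direction onto the ground plane, measure the offset between that projection and the target object, and translate the camera by the same offset so that its height and shooting angle are preserved. The sketch below assumes a y-up coordinate system and a downward-pointing view direction; the function name and parameters are illustrative, not taken from the disclosure.

```python
def target_camera_position(camera_pos, view_dir, target_pos):
    """Translate the camera so its ground-plane projection coincides with the target.

    camera_pos, target_pos: (x, y, z) with y up; target_pos lies on the ground plane y = 0.
    view_dir: camera forward vector, which must point downwards (view_dir[1] < 0).
    """
    cx, cy, cz = camera_pos
    dx, dy, dz = view_dir
    t = -cy / dy                                   # distance along view_dir to the ground plane
    proj = (cx + t * dx, 0.0, cz + t * dz)         # current projection position
    offset = (target_pos[0] - proj[0], target_pos[2] - proj[2])
    # Pure translation: height and shooting angle stay unchanged, so the target
    # object ends up at the centre of the camera's field of view.
    return (cx + offset[0], cy, cz + offset[1])


# Camera at height 10 looking straight down, target object at (5, 0, 3).
print(target_camera_position((0.0, 10.0, 0.0), (0.0, -1.0, 0.0), (5.0, 0.0, 3.0)))  # (5.0, 10.0, 3.0)
```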
  • the first response module 710 is specifically configured to, in response to a first sliding operation on the graphical user interface, obtain a first movement vector of the first sliding operation; determine a first displacement vector based on the first movement vector and a first preset parameter; control the virtual camera to move in the virtual scene according to the first displacement vector; determine a second displacement vector based on the first movement vector and a second preset parameter; and update the scene content displayed in the thumbnail according to the second displacement vector.
  • the disclosed embodiment uses the above-mentioned first response module to determine, based on the first movement vector of the first sliding operation on the graphical user interface, the first displacement vector mapped in the large map and the second displacement vector mapped in the thumbnail, and then controls the virtual camera to move in the virtual scene according to the first displacement vector and updates the scene content displayed in the thumbnail according to the second displacement vector, so that the player's first sliding operation on the graphical user interface produces synchronous movement in the virtual scene and in the thumbnail, thereby synchronizing the display and improving the player's gaming experience.
  • the first response module 710 is further used to determine the size of the first preset parameter and the second preset parameter according to the proportional relationship between the virtual scene and the thumbnail.
  • the embodiment of the present disclosure can make the ratio of the first preset parameter to the second preset parameter the same as the ratio of the virtual scene to the thumbnail, thereby achieving synchronous and more accurate mapping of the player's first sliding operation in the graphical user interface to both the thumbnail and the virtual scene, with greater reliability.
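A minimal sketch of this mapping, assuming a square scene and a square thumbnail with illustrative sizes: the ratio of the first preset parameter to the second equals the scene-to-thumbnail ratio, so a single slide drives the camera in the scene and the viewport frame in the thumbnail synchronously.

```python
SCENE_SIZE = 10000.0   # assumed side length of the virtual scene, in scene units
THUMB_SIZE = 200.0     # assumed side length of the thumbnail, in screen pixels

# The ratio of the two preset parameters equals the scene-to-thumbnail ratio.
FIRST_PRESET = SCENE_SIZE / THUMB_SIZE   # scene units per pixel of sliding
SECOND_PRESET = 1.0                      # thumbnail pixels per pixel of sliding


def map_slide(move_vector):
    """Return (camera displacement in the scene, frame displacement in the thumbnail)."""
    mx, my = move_vector
    first_displacement = (mx * FIRST_PRESET, my * FIRST_PRESET)
    second_displacement = (mx * SECOND_PRESET, my * SECOND_PRESET)
    return first_displacement, second_displacement


# A 4-pixel slide moves the camera 200 scene units and the thumbnail frame 4 pixels.
print(map_slide((4, 0)))   # ((200.0, 0.0), (4.0, 0.0))
```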
  • FIG8 shows a schematic diagram of the structure of a computer system of a terminal device suitable for implementing an embodiment of the present disclosure.
  • the computer system includes a central processing unit (CPU), which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) or a program loaded from a storage part into a random access memory (RAM).
  • in the RAM, various programs and data required for system operation are also stored.
  • the CPU, ROM, and RAM are connected to each other through a bus.
  • the input/output (I/O) interface is also connected to the bus.
  • the following components are connected to the I/O interface: an input part including a keyboard, a mouse, etc.; an output part including a cathode ray tube (CRT), a liquid crystal display (LCD), etc., and a speaker; a storage part including a hard disk, etc.; and a communication part including a network interface card such as a local area network card (LAN card), a modem, etc.
  • the communication part performs communication processing via a network such as the Internet.
  • a drive is also connected to the I/O interface as needed.
  • Removable media such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, etc., are installed on the drive as needed so that the computer program read therefrom is installed into the storage part as needed.
  • the above modules may be one or more integrated circuits configured to implement the above methods, for example, one or more application specific integrated circuits (ASIC), one or more digital signal processors (DSP), or one or more field programmable gate arrays (FPGA).
  • the processing element may be a general-purpose processor, such as a CPU or other processor that can call program code.
  • these modules may be integrated together and implemented in the form of a system-on-a-chip (SOC).
  • the disclosed devices and methods can be implemented in other ways.
  • the device embodiments described above are only schematic, for example, the division of units is only a logical function division, and there may be other division methods in actual implementation, such as multiple units or components can be combined or integrated into another system, or some features can be ignored or not executed.
  • in addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
  • each functional unit in each embodiment of the present disclosure may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit may be implemented in the form of hardware or in the form of hardware plus software functional units.
  • the above-mentioned integrated unit implemented in the form of a software functional unit can be stored in a computer-readable storage medium.
  • the above-mentioned software functional unit is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform some of the steps of the methods of the embodiments of the present disclosure.
  • Step 201: In response to a first sliding operation on a graphical user interface, a virtual camera is controlled to move in a virtual scene, and a first virtual scene image captured according to the movement of the virtual camera is synchronously displayed in the graphical user interface;
  • Step 202: In response to the movement of the virtual camera causing the target object in the virtual scene to be displayed in a preset range area of the graphical user interface, the virtual camera is controlled to move to a target position, and a second virtual scene image at least partially including the target object is obtained by the virtual camera staying at the target position and photographing the target object;
  • Step 203: The second virtual scene image is continuously displayed in the graphical user interface.
  • the display control method of the virtual scene proposed in the embodiment of the present disclosure can, in response to the first sliding operation acting on the graphical user interface, control the virtual camera to move in the virtual scene and synchronously display in the graphical user interface the first virtual scene picture captured according to the movement of the virtual camera. After the movement of the virtual camera causes the target object in the virtual scene to be displayed in a preset range area of the graphical user interface, the virtual camera is controlled to move to the target position, a second virtual scene picture at least partially including the target object is obtained by the virtual camera staying at the target position and photographing the target object, and the second virtual scene picture is continuously displayed in the graphical user interface. This ensures that, while the target object is in the preset range area, the second virtual scene picture displayed in the graphical user interface remains fixed, so that the player can view the information of the target object more stably without frequently switching between the large map and the small map, thereby solving the current technical problem of poor convenience in exploring target objects and viewing object information in virtual scenes.
  • in response to a second sliding operation for controlling the virtual camera to resume movement in the virtual scene, a third virtual scene picture captured according to the movement of the virtual camera is synchronously displayed in the graphical user interface, wherein the second sliding operation and the first sliding operation are continuous operations.
  • in this way, the second virtual scene picture containing the target object is locked and displayed in the graphical user interface to facilitate the player's viewing of information, with a good viewing effect and high viewing stability. When the virtual camera leaves the target position where the target object can be captured, the locking relationship between the virtual camera and the target object is released, so that the virtual camera moves with the player's third sliding operation, thereby achieving rapid movement in the virtual scene and making the exploration of target objects and the viewing of object information in the virtual scene more efficient. There is no need to frequently switch operations while exploring target objects and viewing object information in the virtual scene, which greatly improves the convenience of player operation while ensuring the display effect of the large and small maps.
  • the virtual camera has a projection position mapped to the ground plane of the virtual scene, and the projection position moves as the virtual camera moves in the virtual scene; when the projection position of the virtual camera enters a target area in the virtual scene through a first sliding operation, the target object is displayed in a preset range area of the graphical user interface, wherein the target area is a scene area determined according to the position of the target object.
  • a first control association between a first sliding operation and controlling the movement of a virtual camera is released, a target position is determined according to a position of a target object in a virtual scene, and the virtual camera is controlled to move to the target position, wherein the first control association is used to control the movement of the virtual camera according to the first sliding operation.
  • the first control association between the first sliding operation and the control of the movement of the virtual camera can be released when the target object is displayed in the preset range area of the graphical user interface.
  • the target position is determined according to the position of the target object in the virtual scene, and the virtual camera is controlled to move to the target position. This can achieve stable shooting of the target object at the target position and obtain stable and continuous information of the target object, which is convenient for players to view. There is no need to frequently switch between the large map and the small map, the information viewing of the target object is more convenient, and the information display is more stable and durable.
  • a second control association is established between the first sliding operation and an operation positioning position in the virtual scene, and the second control association is used to determine the operation positioning position corresponding to the first sliding operation. Correspondingly, in response to a second sliding operation used to control the virtual camera to resume movement in the virtual scene, synchronously displaying in the graphical user interface a third virtual scene picture captured according to the movement of the virtual camera includes: in response to the first sliding operation causing the operation positioning position to leave the target area in the virtual scene, establishing a third control association between the second sliding operation and the control of the movement of the virtual camera, wherein the third control association is used to control the virtual camera to move according to the second sliding operation; and synchronously displaying in the graphical user interface the third virtual scene picture captured according to the movement of the virtual camera.
  • in this way, a second control association between the first sliding operation and the operation positioning position in the virtual scene can be established, and after the first sliding operation causes the operation positioning position to leave the target area in the virtual scene, a third control association between the second sliding operation and the control of the movement of the virtual camera is established. Different associations are thus constructed for the first sliding operation and the second sliding operation when the projection position of the virtual camera is within the target area in the virtual scene and when it leaves the target area, so as to perform targeted display control and achieve higher operation control accuracy.
  • in response to the first sliding operation when the operation positioning position does not leave the target area in the virtual scene, the second virtual scene image is continuously displayed in the graphical user interface.
  • in response to the first sliding operation, when the operation positioning position does not leave the target area in the virtual scene, the second virtual scene picture is continuously displayed in the graphical user interface, so as to ensure that the corresponding second control association controls the mapping of the virtual camera in the virtual scene, which improves the reliability of the virtual scene display and facilitates the player's viewing of the information of the target object with higher reliability.
  • a graphical user interface provides a game thumbnail of the virtual scene; the response area of the game thumbnail is a first area, and the area of the graphical user interface other than the first area is a second area. The first sliding operation acting on the graphical user interface includes a sliding operation starting from the first area and passing through the second area.
  • the game thumbnail described above makes the player's operation simple and quick, without the need to frequently switch between the large map and the small map, so the player's operation efficiency is higher. At the same time, the sliding operation area is larger, which makes the player's operation more convenient.
  • in response to a first sliding operation applied to the graphical user interface, the virtual camera is controlled to move at a first rate in the virtual scene, and a first virtual scene image captured when the virtual camera moves at the first rate is synchronously displayed in the graphical user interface; in response to a third sliding operation in the second area, the virtual camera is controlled to move at a second rate in the virtual scene, and a fourth virtual scene image captured when the virtual camera moves at the second rate is synchronously displayed in the graphical user interface, wherein the first rate is greater than the second rate.
  • in this way, when sliding in the first area, or when sliding from the first area of the game thumbnail into the second area outside the game thumbnail, the virtual camera can move at the larger first rate to collect more scene information; when the sliding operation starts from the second area outside the game thumbnail, the virtual camera moves and shoots at the smaller second rate, providing corresponding scene information for the large map and for different map models, with higher reliability.
  • the target object includes: a target virtual model and a target scene area adjacent to the target virtual model, and the second virtual scene picture at least includes the target virtual model.
  • the preset range area includes a full screen area, an edge area, or a screen area generated with the center point of the graphical user interface as its center.
  • the preset range area may be the full screen area, the edge area, or the screen area generated with the center point of the graphical user interface as its center, which provides higher flexibility.
  • the target position is the position of the virtual camera when the virtual camera is translated to a position where the camera shooting angle is aimed at the target object.
  • the target position is the position of the virtual camera when the virtual camera is translated to a position where the camera shooting angle is aimed at the target object, thereby ensuring that the target object can be completely photographed by the virtual camera.
  • the virtual camera has a projection position mapped onto the ground plane of the virtual scene, and the target position is the position of the virtual camera when the virtual camera is translated so that its projection position coincides with the position of the target object in the virtual scene.
  • in this way, the target object is located at the center of the shooting field of view of the virtual camera, thereby ensuring the integrity and reliability of the target object information and making it convenient for players to view the information.
  • a first movement vector of the first sliding operation is obtained; a first displacement vector is determined based on the first movement vector and a first preset parameter; a virtual camera is controlled to move in a virtual scene according to the first displacement vector; a second displacement vector is determined based on the first movement vector and the second preset parameter; and the scene content displayed in the thumbnail is updated according to the second displacement vector.
  • based on the first movement vector of the first sliding operation on the graphical user interface, the first displacement vector mapped in the large map and the second displacement vector mapped in the thumbnail are respectively determined; the virtual camera is then controlled to move in the virtual scene according to the first displacement vector, and the scene content displayed in the thumbnail is updated according to the second displacement vector, so that the player's first sliding operation on the graphical user interface produces synchronous movement in the virtual scene and in the thumbnail, thereby synchronizing the display and improving the player's gaming experience.
  • the sizes of the first preset parameter and the second preset parameter are determined according to a proportional relationship between the virtual scene and the thumbnail.
  • the embodiment of the present disclosure can make the ratio of the first preset parameter to the second preset parameter the same as the ratio of the virtual scene to the thumbnail, thereby achieving synchronous and more accurate mapping of the player's first sliding operation in the graphical user interface to both the thumbnail and the virtual scene, with greater reliability.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A virtual scene display control method, comprising: in response to a first sliding operation acting on a graphical user interface, controlling a virtual camera to move in a virtual scene, and synchronously displaying on the graphical user interface a first virtual scene image captured according to the movement of the virtual camera (201); in response to the movement of the virtual camera causing a target object in the virtual scene to be displayed in a preset range area of the graphical user interface, controlling the virtual camera to move to a target position, and acquiring a second virtual scene image at least partially containing the target object (202); and continuously displaying the second virtual scene image on the graphical user interface (203). The present disclosure solves the technical problem in the prior art of poor convenience for viewing information and for position-movement operations in virtual scenes, and achieves the technical effect of improving the convenience of information viewing and position-movement operations in virtual scenes.
PCT/CN2023/086883 2022-10-14 2023-04-07 Procédé et appareil de commande d'affichage de scène virtuelle, support d'enregistrement et dispositif électronique WO2024077897A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211261252.0A CN115591234A (zh) 2022-10-14 2022-10-14 虚拟场景的显示控制方法、装置、存储介质及电子设备
CN202211261252.0 2022-10-14

Publications (1)

Publication Number Publication Date
WO2024077897A1 true WO2024077897A1 (fr) 2024-04-18

Family

ID=84847831

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/086883 WO2024077897A1 (fr) 2022-10-14 2023-04-07 Procédé et appareil de commande d'affichage de scène virtuelle, support d'enregistrement et dispositif électronique

Country Status (2)

Country Link
CN (1) CN115591234A (fr)
WO (1) WO2024077897A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115591234A (zh) * 2022-10-14 2023-01-13 网易(杭州)网络有限公司(Cn) 虚拟场景的显示控制方法、装置、存储介质及电子设备
CN116260956B (zh) * 2023-05-15 2023-07-18 四川中绳矩阵技术发展有限公司 一种虚拟现实拍摄方法及系统

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108355354A (zh) * 2018-02-11 2018-08-03 网易(杭州)网络有限公司 信息处理方法、装置、终端和存储介质
CN110141855A (zh) * 2019-05-24 2019-08-20 网易(杭州)网络有限公司 视角控制方法、装置、存储介质及电子设备
US20190299091A1 (en) * 2017-03-27 2019-10-03 Netease (Hangzhou) Network Co.,Ltd. Display control method and apparatus for game screen, storage medium, and electronic device
CN110694271A (zh) * 2019-10-21 2020-01-17 网易(杭州)网络有限公司 游戏场景中的相机姿态控制方法、装置及电子设备
CN111672106A (zh) * 2020-06-05 2020-09-18 腾讯科技(深圳)有限公司 虚拟场景显示方法、装置、计算机设备及存储介质
CN113559501A (zh) * 2021-07-29 2021-10-29 网易(杭州)网络有限公司 游戏中的虚拟单位选取方法及装置、存储介质及电子设备
US20210402298A1 (en) * 2018-12-19 2021-12-30 Netease (Hangzhou) Network Co.,Ltd. Method and Apparatus for Controlling Virtual Camera in Game
CN114159787A (zh) * 2021-12-13 2022-03-11 网易(杭州)网络有限公司 虚拟对象的控制方法、装置、电子设备及可读介质
CN115591234A (zh) * 2022-10-14 2023-01-13 网易(杭州)网络有限公司(Cn) 虚拟场景的显示控制方法、装置、存储介质及电子设备

Also Published As

Publication number Publication date
CN115591234A (zh) 2023-01-13

Similar Documents

Publication Publication Date Title
CN108257219B (zh) 一种实现全景多点漫游的方法
WO2024077897A1 (fr) Procédé et appareil de commande d'affichage de scène virtuelle, support d'enregistrement et dispositif électronique
US20240096039A1 (en) Display control apparatus, display control method, and program
CN112263837B (zh) 虚拟环境中的天气渲染方法、装置、设备及存储介质
US20180225880A1 (en) Method and Apparatus for Providing Hybrid Reality Environment
CN105339987B (zh) 针对陆地、空中和/或众包可视化的流形的图像提取和基于图像的渲染
CN108579083B (zh) 虚拟场景显示方法、装置、电子装置及存储介质
US8933965B2 (en) Method for calculating light source information and generating images combining real and virtual images
US20180276882A1 (en) Systems and methods for augmented reality art creation
WO2018077071A1 (fr) Procédé et appareil de génération d'image panoramique
CN101489150B (zh) 一种虚实混合的远程协同工作方法
WO2023280038A1 (fr) Procédé de construction d'un modèle tridimensionnel de scène réelle et appareil associé
CN114125310B (zh) 拍照方法、终端设备及云端服务器
CN106780707B (zh) 模拟场景中全局光照的方法和装置
US20100066732A1 (en) Image View Synthesis Using a Three-Dimensional Reference Model
CN111467803B (zh) 游戏中的显示控制方法及装置、存储介质、电子设备
CN116057577A (zh) 用于增强现实的地图
US20230347240A1 (en) Display method and apparatus of scene picture, terminal, and storage medium
CN112562051B (zh) 虚拟对象显示方法、装置、设备及存储介质
CN115082607B (zh) 虚拟角色头发渲染方法、装置、电子设备和存储介质
CN108205820B (zh) 平面的重建方法、融合方法、装置、设备及存储介质
CN116708862A (zh) 直播间的虚拟背景生成方法、计算机设备及存储介质
CN116485969A (zh) 体素对象生成方法、装置和计算机可读存储介质
CN115382208A (zh) 三维指引地图生成方法、装置、存储介质和电子装置
Piekarski et al. Tinmith-mobile outdoor augmented reality modelling demonstration

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23876109

Country of ref document: EP

Kind code of ref document: A1