
CN109806583B - User interface display method, device, equipment and system - Google Patents

User interface display method, device, equipment and system

Info

Publication number
CN109806583B
Authority
CN
China
Prior art keywords
terminal
virtual object
environment
information
screen
Prior art date
Legal status
Active
Application number
CN201910068959.1A
Other languages
Chinese (zh)
Other versions
CN109806583A (en)
Inventor
张雅
李熠琦
周西洋
文晗
黄荣灏
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201910068959.1A
Publication of CN109806583A
Application granted
Publication of CN109806583B
Status: Active

Landscapes

  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a user interface display method, device, equipment and system, belonging to the field of computer technology. The method comprises the following steps: displaying a first user interface that includes a first environment picture of the virtual environment and a collaborative display control; sending a request signal when a signal triggered on the collaborative display control is received; receiving a feedback signal, sent by a second terminal, that carries unique identification information; and sending the unique identification information and the information of the target virtual object to a server, thereby triggering the server to send the information corresponding to the associated picture, obtained according to the information of the virtual object, to the second terminal according to the unique identification information; after receiving the information of the associated picture, the second terminal generates and displays the associated picture. Because the associated picture is displayed on the second terminal in addition to the first terminal, a user can obtain more information about the virtual environment through the associated picture, which improves the convenience of using the application program.

Description

User interface display method, device, equipment and system
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a system for displaying a user interface.
Background
On terminals such as smartphones and tablet computers, there are many application programs that provide virtual environments, such as military simulation applications, third-person shooter (TPS) games, first-person shooter (FPS) games, multiplayer online battle arena (MOBA) games, and the like.
When the terminal runs the application program, a display screen of the terminal displays a user interface of the application program, and the user interface usually displays an environment picture of a virtual environment. For example, an environment screen in which a virtual environment is viewed from a viewpoint obliquely above a virtual character is displayed in a user interface of the MOBA game.
Since the content that can be displayed in the environment picture on the terminal is limited, the user obtains only a small amount of information from the environment picture, which makes the application program inconvenient to use.
Disclosure of Invention
The embodiments of the application provide a user interface display method, device, equipment and system, to solve the problem in the related art that an application program is inconvenient to use because the environment picture displayed by a terminal carries only a small amount of information. The technical solution is as follows:
in one aspect, an embodiment of the present application provides a user interface display method, where the method is executed by a first terminal, and the method includes:
displaying a first user interface of an application program, wherein a first environment picture and a collaborative display control are displayed in the first user interface, and the first environment picture comprises a picture of a first local area determined by a target position of a target virtual object in a virtual environment;
when a signal triggered on the collaborative display control is received, sending a request signal through a first short-range wireless communication component, wherein the request signal is used for requesting the application program in a second terminal to display an associated picture, and the associated picture is a picture that displays content associated with the target virtual object;
receiving a feedback signal sent by a second terminal through a second short-distance wireless communication component through the first short-distance wireless communication component, wherein the feedback signal comprises unique identification information corresponding to the application program in the second terminal;
sending the unique identification information and the information of the target virtual object to a server, wherein the unique identification information is used for triggering the server to send the information corresponding to the associated picture to the second terminal; and the information of the target virtual object is used for assisting the server to acquire the information corresponding to the associated picture according to the information of the target virtual object.
In one aspect, an embodiment of the present application provides a user interface display method, where the method is executed by a second terminal, and the method includes:
receiving a request signal sent by the first terminal through the first short-distance wireless communication component through the second short-distance wireless communication component;
sending a feedback signal to the first terminal through the second short-distance wireless communication assembly, wherein the feedback signal carries unique identification information of an application program in the second terminal, and the unique identification information is used for triggering a server to send information corresponding to the associated picture;
receiving information corresponding to the associated picture sent by the server;
generating the associated picture according to the information corresponding to the associated picture;
and displaying a second user interface of the application program, wherein the associated picture is displayed in the second user interface.
In one aspect, an embodiment of the present application provides a user interface display method, where the method is performed by a server, and the method includes:
receiving information and unique identification information of a target virtual object sent by a first terminal; the target virtual object is a virtual object in an application program running in the first terminal, and the unique identification information is identification information corresponding to the application program in the second terminal;
acquiring information corresponding to an associated picture according to the information of the target virtual object, wherein the associated picture is a picture associated with the target virtual object;
and sending information corresponding to the associated picture to the application program in the second terminal according to the unique identification information, wherein the information corresponding to the associated picture is used for assisting the second terminal to generate the associated picture.
In one aspect, an embodiment of the present application provides a user interface display method, where the method is executed by a second terminal, and the method includes:
displaying a user interface of an application;
displaying a request window on the user interface, wherein the request window is a window in which an application program in the first terminal requests to display an associated picture, and an acceptance control is displayed in the request window;
and when a trigger signal triggered on the receiving control is received, displaying a second user interface, wherein the associated picture is displayed in the second user interface, and the associated picture displays a picture associated with a target virtual object, and the target virtual object is a virtual object which is active in a virtual environment and is controlled by the first terminal.
In one aspect, an embodiment of the present application provides a user interface display apparatus, where the apparatus is applied to a first terminal, and the apparatus includes:
the display module is used for displaying a first user interface of an application program, wherein a first environment picture and a collaborative display control are displayed in the first user interface, and the first environment picture comprises a picture of a first local area determined by a target position of a target virtual object in a virtual environment;
the first short-distance wireless communication module is used for sending a request signal through a first short-distance wireless communication component when a signal triggered on the collaborative display control is received, wherein the request signal is used for requesting the application program in the second terminal to display an associated picture, and the associated picture is a picture that displays content associated with the target virtual object; and for receiving, through the first short-distance wireless communication component, a feedback signal sent by a second terminal through a second short-distance wireless communication component, wherein the feedback signal comprises unique identification information corresponding to the application program in the second terminal;
a sending module, configured to send the unique identification information and the information of the target virtual object to a server, where the unique identification information is used to trigger the server to send information corresponding to the associated screen to the second terminal; and the information of the target virtual object is used for assisting the server to acquire the information corresponding to the associated picture according to the information of the target virtual object.
In one aspect, an embodiment of the present application provides a user interface display apparatus, where the apparatus is applied to a second terminal, and the apparatus includes:
the second short-distance wireless communication module is used for receiving a request signal sent by the first terminal through the first short-distance wireless communication component through the second short-distance wireless communication component; sending a feedback signal to the first terminal through the second short-distance wireless communication assembly, wherein the feedback signal carries unique identification information of an application program in the second terminal, and the unique identification information is used for triggering a server to send information corresponding to the associated picture;
the receiving module is used for receiving the information corresponding to the associated picture sent by the server;
the processing module is used for generating the associated picture according to the information corresponding to the associated picture;
and the display module is used for displaying a second user interface of the application program, and the second user interface is displayed with the associated picture.
In one aspect, an embodiment of the present application provides a user interface display apparatus, where the apparatus is applied in a server, and the apparatus includes:
the receiving module is used for receiving the information and the unique identification information of the target virtual object sent by the first terminal; the target virtual object is a virtual object in an application program running in the first terminal, and the unique identification information is identification information corresponding to the application program in the second terminal;
the processing module is used for acquiring information corresponding to an associated picture according to the information of the target virtual object, wherein the associated picture is a picture associated with the target virtual object;
and the sending module is used for sending the information corresponding to the associated picture to the application program in the second terminal according to the unique identification information, wherein the information corresponding to the associated picture is used for assisting the second terminal to generate the associated picture.
In one aspect, the present embodiments provide a terminal including a processor, a memory and a first short-range wireless communication component, where the memory stores at least one instruction, and the instruction is loaded and executed by the processor to implement the user interface display method executed by the first terminal as described above.
In one aspect, the present embodiments provide a terminal including a processor, a memory and a second short-range wireless communication component, where the memory stores at least one instruction, and the instruction is loaded and executed by the processor to implement the user interface display method executed by the second terminal as described above.
In one aspect, an embodiment of the present application provides a computer device, where the computer device includes a processor and a memory, where the memory stores at least one instruction, and the instruction is loaded and executed by the processor to implement the user interface display method executed by the server as described above.
In one aspect, the present application provides a computer system, where the computer system includes the user interface display device applied to the first terminal as described above, the user interface display device applied to the second terminal as described above, and the user interface display device applied to the server as described above;
or, alternatively,
the computer system comprises a first terminal as described above, a second terminal as described above and a computer device as described above.
In one aspect, embodiments of the present application provide a computer-readable storage medium, where at least one instruction is stored, and the instruction is loaded and executed by a processor to implement the user interface display method described above.
The beneficial effects brought by the technical solutions provided by the embodiments of the application include at least the following:
after the first terminal and the second terminal determine that the associated picture is displayed on the second terminal through respective short-distance wireless communication components, the server sends information corresponding to the associated picture to the second terminal, and the second terminal generates and displays the associated picture according to the information corresponding to the associated picture; because the second terminal near the first terminal displays the related picture related to the target virtual object, more information related to the target virtual object can be displayed through the second terminal, so that a user can obtain more information of the virtual environment, and the use convenience of the application program is improved.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 2 is a flow chart of a user interface display method provided by an exemplary embodiment of the present application;
FIG. 3 is a schematic illustration of a user interface provided by an exemplary embodiment of the present application;
FIG. 4 is a schematic illustration of a user interface provided by an exemplary embodiment of the present application;
FIG. 5 is a flow chart of a user interface display method provided by an exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of framing a view area, provided by an exemplary embodiment of the present application;
FIG. 7 is a flowchart of a user interface display method provided by an exemplary embodiment of the present application;
FIG. 8 is a schematic illustration of a viewpoint of a three-dimensional virtual environment provided by an exemplary embodiment of the present application;
FIG. 9 is a schematic illustration of a user interface provided by an exemplary embodiment of the present application;
FIG. 10 is a block diagram of a user interface display device provided in an exemplary embodiment of the present application;
FIG. 11 is a block diagram of a user interface display device provided in an exemplary embodiment of the present application;
FIG. 12 is a block diagram of a user interface display device provided in an exemplary embodiment of the present application;
fig. 13 is a block diagram of a terminal provided in an exemplary embodiment of the present application;
FIG. 14 is a block diagram of a computer device provided in an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, a number of terms related to the embodiments of the present application will be briefly described:
short-range wireless communication: a wireless communication mode implemented within a short range. In the embodiments of the present application, the short-range wireless communication technology includes the Wireless-Fidelity Direct (Wi-Fi Direct) technology, the Bluetooth technology, the Bluetooth Wi-Fi technology, or the Near Field Communication (NFC) technology.
Wi-Fi Direct: the technology is to realize point-to-point data transmission in a shorter distance range by using Radio Frequency (RF).
Bluetooth: an open global specification for wireless data and voice communication. Its essence is to establish a general radio air interface for communication between fixed or mobile devices, and to combine communication technology with computer technology, so that various electronic devices can communicate or interoperate with each other within a short range without wires or cables.
Bluetooth Wi-Fi: the device identifier of a nearby terminal is obtained by scanning through the Bluetooth technology, and data transmission is then carried out through the Wi-Fi Direct technology.
NFC: a short-range high-frequency radio technology that supports bidirectional data transmission within a short range; near field communication components are currently configured in many terminals. When two terminals with the NFC function enter the working range of each other's near field communication components, bidirectional data transmission can be achieved.
Virtual environment: the virtual environment displayed (or provided) when an application program runs on the terminal. The virtual environment may be a simulated environment of the real world, a semi-simulated semi-fictional environment, or a purely fictional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment. Optionally, the virtual environment is also used for a battle between at least two target virtual objects.
Virtual object: a movable object in the virtual environment. The movable object may be at least one of a virtual character, a virtual animal, and an animation character. Optionally, when the virtual environment is a three-dimensional virtual environment, the target virtual object is a three-dimensional model created based on skeletal animation technology. Each target virtual object has its own shape and volume in the three-dimensional virtual environment and occupies a portion of the space in that environment.
Viewing frame (finder frame): the field-of-view region of a virtual object in a two-dimensional virtual environment. Illustratively, the two-dimensional virtual environment includes a 10000 × 10000-pixel background map, together with the virtual objects and virtual items active on it; the finder frame is a box of, for example, 160 × 90 pixels whose reference point (for example, the center point) is the position of the virtual object. The electronic device frames the virtual environment through the finder frame to obtain the view area corresponding to the virtual object; the view area includes the framed local map, the virtual objects active on that local map, and the virtual items present on it.
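For illustration only, the following minimal Python sketch models this framing operation. The 10000 × 10000 map and 160 × 90 frame come from the example above; the function name and the clamping of the frame to the map edges are invented assumptions, since the patent does not specify boundary behavior.

```python
# Hypothetical sketch of the finder frame described above (not from the
# patent). Map and frame sizes come from the example; clamping the frame
# to the map edge is an assumption.

def frame_region(center_x, center_y, frame_w=160, frame_h=90,
                 map_w=10000, map_h=10000):
    """Return (left, top, right, bottom) of the view area whose
    reference point (here: the center) is the virtual object's position."""
    left = min(max(center_x - frame_w / 2, 0), map_w - frame_w)
    top = min(max(center_y - frame_h / 2, 0), map_h - frame_h)
    return (left, top, left + frame_w, top + frame_h)

# The first environment picture is whatever falls inside this region:
print(frame_region(5000, 5000))  # (4920.0, 4955.0, 5080.0, 5045.0)
```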
Observation point: also referred to as a virtual camera, the viewing position corresponding to a virtual object in a three-dimensional virtual environment. The observation point has a fixed relative position and viewing angle with respect to the virtual object. For example, if the coordinate difference between the observation point and the virtual object is (0, −Δy1, Δz1) and the viewing angle is α, the picture of the virtual environment displayed on the terminal is the environment picture of the virtual environment as observed from the observation point.
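As a minimal sketch of this camera relation, the helper below places the camera at the fixed offset (0, −Δy1, Δz1) from the virtual object; the function name and the concrete values for Δy1 and Δz1 are invented for illustration.

```python
# Hypothetical sketch: the virtual camera keeps a fixed offset
# (0, -dy1, dz1) from the virtual object, so it follows the object
# around the 3D environment. The numbers are invented.

def camera_position(obj_pos, dy1=3.0, dz1=5.0):
    x, y, z = obj_pos
    return (x, y - dy1, z + dz1)

print(camera_position((10.0, 2.0, 0.0)))  # (10.0, -1.0, 5.0)
```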
Fig. 1 is a block diagram illustrating a computer system according to an exemplary embodiment of the present application. The computer system 100 includes: a first terminal 110, a second terminal 120, and a server 130.
An application program supporting a virtual environment is installed and run on the first terminal 110. The application program can be any one of a military simulation application, a TPS game, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The first terminal 110 is a terminal used by a first user, who uses the first terminal 110 to control a target virtual object located in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the target virtual object is a virtual character, such as a simulated character or an animation character.
The first terminal 110 and the second terminal 120 are connected to the server 130 through a wireless network or a wired network.
The server 130 includes at least one of one server, multiple servers, a cloud computing platform, and a virtualization center. The server 130 is used to provide background services for applications that support virtual environments. Optionally, the server 130 undertakes the primary computational work while the first terminal 110 and the second terminal 120 undertake the secondary computational work; alternatively, the server 130 undertakes the secondary computational work while the first terminal 110 and the second terminal 120 undertake the primary computational work; or the server 130, the first terminal 110, and the second terminal 120 perform cooperative computing using a distributed computing architecture.
The first terminal 110 is provided with a first short-range wireless communication component, the second terminal 120 is provided with a second short-range wireless communication component, and the first terminal 110 and the second terminal 120 can exchange data through the short-range wireless communication through the first short-range wireless communication component and the second short-range wireless communication component.
When the first terminal 110 runs the application program, a first user interface is displayed on its display screen, and a first environment picture of the virtual environment and a collaborative display control are displayed in the first user interface. When a signal triggered on the collaborative display control is received, a request signal is sent through the first short-range wireless communication component; the request signal is used for requesting that an associated picture be displayed on the terminal receiving the request signal, where the associated picture is a picture that displays content associated with the target virtual object. When a feedback signal sent by the second terminal 120 through the second short-range wireless communication component is received, the unique identification information of the application program in the second terminal 120 carried in the feedback signal, together with the information of the target virtual object, is sent to the server 130 through a wired or wireless network.
After receiving the unique identification information and the information of the target virtual object sent by the first terminal 110, the server 130 acquires information corresponding to the associated screen according to the information of the target virtual object; and transmitting information corresponding to the associated screen to the second terminal 120 through a wired or wireless network according to the unique identification information.
The second terminal 120 is installed with the same application program as the first terminal 110, or the application programs installed on the two terminals are the same type of application program for different operating system platforms. After the application program is started, the second terminal 120 receives, through the second short-range wireless communication component, the request signal sent by the first terminal 110 through the first short-range wireless communication component, and then sends a feedback signal carrying the unique identification information corresponding to the application program to the first terminal 110 through the second short-range wireless communication component. When receiving the information corresponding to the associated picture sent by the server 130, the second terminal 120 generates the associated picture according to that information and displays a second user interface including the associated picture.
The first terminal 110 may generally refer to one of a plurality of terminals, and the second terminal 120 may generally refer to one of a plurality of terminals; this embodiment is illustrated using only the first terminal 110 and the second terminal 120. The device types of the first terminal 110 and the second terminal 120 are the same or different, and include at least one of a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, and an MP4 (Moving Picture Experts Group Audio Layer IV) player.
Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
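To make the exchange described above concrete, the following hypothetical Python sketch models the flow of Fig. 1 as plain in-process method calls: the short-range request/feedback exchange, the upload of the unique identification information and target-object information to the server, and the server's push to the second terminal. All class, field, and message names are invented for illustration; a real system would use actual radio and network transports.

```python
# Hypothetical end-to-end sketch of the Fig. 1 exchange (names invented).
# The short-range link and the server push are simulated as method calls.

class Server:
    def __init__(self):
        self.clients = {}                      # unique id -> terminal

    def register(self, uid, terminal):
        self.clients[uid] = terminal

    def upload(self, uid, target_info):
        # Build the associated-picture info from the target object's info
        # and push it to the terminal identified by uid.
        screen_info = {"associated_with": target_info["name"],
                       "position": target_info["position"]}
        self.clients[uid].receive_screen_info(screen_info)

class SecondTerminal:
    uid = "device-B/app-1"                     # unique identification info
    def on_request(self):                      # short-range request received
        return self.uid                        # feedback carries the uid
    def receive_screen_info(self, info):
        print("second terminal renders associated picture:", info)

class FirstTerminal:
    def __init__(self, server):
        self.server = server
    def collaborate(self, peer, target_info):
        uid = peer.on_request()                # request + feedback over the
        self.server.upload(uid, target_info)   # short-range link, then upload

server = Server()
second = SecondTerminal()
server.register(second.uid, second)
FirstTerminal(server).collaborate(second, {"name": "hero", "position": (3, 4)})
```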
Fig. 2 is a flowchart of a user interface display method provided by an exemplary embodiment of the present application. The method may be applied in the computer system 100 shown in fig. 1. The method comprises the following steps:
step 201, a first terminal displays a first user interface of an application program, and a first environment picture and a collaborative display control are displayed in the first user interface.
An application program of a virtual environment is installed in the first terminal. When the first terminal runs the application program, a first user interface of the application program is displayed, and a first environment picture and a collaborative display control are displayed in the first user interface. The first environment picture includes a picture of a first local area determined by the target position of a target virtual object in the virtual environment, where the target virtual object is a virtual object that is controlled by the user through the first terminal and is active in the virtual environment. The collaborative display control is used for prompting the user that an associated picture can be displayed on the display screen of another terminal, where the associated picture is a picture that displays content associated with the target virtual object.
Illustratively, as shown in fig. 3, a first user interface 112 is displayed in a display screen 111 of the first terminal 110, a first environment screen and a collaborative display control 113 are displayed in the first user interface 112, a target virtual object 114 controlled by a user through the first terminal 110 and a first local area determined according to a target position of the target virtual object 114 are displayed in the first environment screen.
Step 202, when receiving a signal triggered on the cooperative display control, the first terminal sends a request signal through the first short-range wireless communication component.
For example, as shown in fig. 3, a user touches the cooperative display control to generate a signal, and after receiving the signal triggering the cooperative display control, the first terminal 110 sends a request signal to the second terminal 120 through the first short-range wireless communication component, where the request signal is used to request that an associated screen is displayed on the second terminal 120.
Optionally, the application program in the first terminal supports at least one near field wireless transmission function of a Wi-Fi Direct function, a bluetooth Wi-Fi function, and an NFC function.
Exemplarily, when the first near field communication component is a first Wi-Fi component, before the first terminal sends a request signal through the first Wi-Fi component, it is required to ensure that the Wi-Fi Direct function is in an on state, the Wi-Fi Direct function may be automatically turned on when the application program is started, or the Wi-Fi Direct function may be turned on in a setting window of the application program, or the Wi-Fi Direct function may be turned on in a setting window of the first terminal operating system, or the Wi-Fi Direct function may be turned on when a signal triggered on the cooperative display control is received; and when the Wi-Fi Direct function is in an open state, the first terminal sends a request signal through the first Wi-Fi component.
Illustratively, when the first near field communication module is a first bluetooth module, before the first terminal sends the request signal through the first bluetooth module, it is required to ensure that the bluetooth function is in an on state, the bluetooth function may be automatically turned on when the application is started, or the bluetooth function may be turned on in a setup window of the application, or the bluetooth function may be turned on in a setup window of the operating system of the first terminal, or the bluetooth function may be turned on when a signal triggered on the cooperative display control is received; the first terminal obtains the equipment identification of the second terminal through the first Bluetooth assembly in a searching mode, the Bluetooth pairing request is sent to the second terminal through the first Bluetooth assembly according to the equipment identification of the second terminal, and after the pairing receiving signal sent by the second terminal through the second Bluetooth assembly is received, the request signal is sent to the second terminal through the first Bluetooth assembly.
Exemplarily, when the first near field communication component is a first Bluetooth component and a first Wi-Fi component, before the first terminal sends a request signal through the first Wi-Fi component, it is required to ensure that the Bluetooth function and the Wi-Fi Direct function are in an on state; the Bluetooth function and the Wi-Fi Direct function may be automatically turned on when the application program is started, or turned on in a setting window of the application program, or turned on in a setting window of the operating system of the first terminal, or turned on when a signal triggered on the collaborative display control is received. When the Bluetooth function and the Wi-Fi Direct function are in the on state, the first terminal obtains the device identifier of the second terminal by searching through the first Bluetooth component, and sends a request signal to the second terminal through the first Wi-Fi component according to the device identifier of the second terminal.
Illustratively, when the first near field communication component is a first NFC component, before the first terminal sends the request signal through the first NFC component, it is required to ensure that the NFC function is in an on state; the NFC function may be automatically turned on when the application is started, or turned on in a setting window of the application, or turned on in a setting window of the operating system of the first terminal, or turned on when a signal triggered on the collaborative display control is received. When the NFC function is in the on state, the first terminal sends the request signal through the first NFC component.
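The four cases above share one pattern: check that the relevant radio function is on, enable it if it is not, then send the request. A hypothetical sketch of that pattern follows; the Transport enum and the is_on/enable/send hooks are invented stand-ins for the platform-specific calls, not real device APIs.

```python
# Sketch of the "ensure the radio is on, then send" pattern above.
# Enum values and the hooks are hypothetical, not platform APIs.

from enum import Enum

class Transport(Enum):
    WIFI_DIRECT = "Wi-Fi Direct"
    BLUETOOTH = "Bluetooth"
    NFC = "NFC"

def send_request(transport, is_on, enable, send):
    """is_on/enable/send stand in for platform-specific calls."""
    if not is_on(transport):
        enable(transport)  # e.g. turned on when the control is triggered
    send(transport, {"type": "collab_request"})

send_request(Transport.WIFI_DIRECT,
             is_on=lambda t: False,
             enable=lambda t: print("enabling", t.value),
             send=lambda t, msg: print("sending", msg, "over", t.value))
```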
In step 203, when the second terminal receives the request signal through the second short-range wireless communication module, the second terminal sends a feedback signal to the first terminal through the second short-range wireless communication module, wherein the feedback signal includes the unique identification information.
The unique identification information is the unique identification information corresponding to the application program in the second terminal. The second terminal is installed with the same application program as the first terminal, or the application program installed in the second terminal and the application program run in the first terminal are the same type of application program for different operating system platforms. Optionally, the unique identification information corresponding to the application program in the second terminal includes at least one of a user account, a device identifier, and the IP address of the second terminal.
Optionally, when the second terminal receives the request signal through the second short-range wireless communication module, the sending the feedback signal to the first terminal through the second short-range wireless communication module includes, but is not limited to, the following steps:
step 203a, when the second terminal receives the request signal sent by the first terminal through the first short-range wireless communication component through the second short-range wireless communication component, displaying a request window.
For example, as shown in fig. 3, when the second terminal 120 receives, through the second short-range wireless communication component, the request signal sent by the first terminal 110 through the first short-range wireless communication component, the request window 122 is displayed on the display screen 121. If the request signal carries the user account, user A, corresponding to the first terminal 110, then the prompt "User A initiates a multi-screen collaboration request to you", an accept control 123, and a reject control 124 are displayed in the request window 122.
Step 203b, when the second terminal receives the signal triggered on the request window, a feedback signal is sent to the first terminal through the second short-range wireless communication component.
For example, as shown in fig. 3, the user touches the receiving control 123 to generate a trigger signal, and the second terminal 120 sends a feedback signal to the first terminal 110 through the second short-range wireless communication component after receiving the trigger signal, where the feedback signal carries the unique identification information of the second terminal 120.
The application program running in the second terminal supports at least one near field wireless transmission function of a Wi-Fi Direct function, a Bluetooth Wi-Fi function and an NFC function.
Exemplarily, when the second near field communication component is a second Wi-Fi component, before the second terminal receives the request signal through the second Wi-Fi component, it is required to ensure that the Wi-Fi Direct function is in an on state, the Wi-Fi Direct function may be automatically turned on when the application program is started, or the Wi-Fi Direct function may be turned on in a setting window of the application program, or the Wi-Fi Direct function may be turned on in a setting window of an operating system of the second terminal; and when the Wi-Fi Direct function is in an open state, the second terminal sends a feedback signal to the first terminal through the second Wi-Fi assembly after receiving the request signal sent by the first terminal through the second Wi-Fi assembly.
Illustratively, when the second near field communication module is a second bluetooth module, before the second terminal receives the request signal through the second bluetooth module, it is required to ensure that the bluetooth function is in an on state, and the bluetooth function may be automatically turned on when the application is started, or turned on in a setup window of the application, or turned on in a setup window of an operating system of the second terminal; when the Bluetooth function is in an open state, the second terminal receives a Bluetooth pairing request sent by the first terminal through the first Bluetooth assembly through the second Bluetooth assembly, sends a pairing receiving signal to the first terminal through the second Bluetooth assembly, and sends a feedback signal to the first terminal through the second Bluetooth assembly after receiving a request signal sent by the first terminal through the first Bluetooth assembly through the second Bluetooth assembly.
It should be noted that, in this embodiment of the application, after obtaining the device identifier of the second terminal through the first bluetooth component, the first terminal may send a bluetooth pairing request to the second terminal through the first bluetooth component; or after the second terminal obtains the device identifier of the first terminal through the second bluetooth module, the second terminal sends a bluetooth pairing request to the first terminal through the second bluetooth module, which is not limited herein.
Exemplarily, when the second near field communication component is a second bluetooth component and a second Wi-Fi component, before the second terminal receives the request signal through the second Wi-Fi component, it is required to ensure that the bluetooth function and the Wi-Fi Direct function are in an on state, the bluetooth function and the Wi-Fi Direct function may be automatically turned on when the application program is started, or the bluetooth function and the Wi-Fi Direct function may be turned on at a setup window of the application program, or the bluetooth function and the Wi-Fi Direct function may be turned on at a setup window of an operating system of the second terminal; when the Bluetooth function and the Wi-Fi Direct function are in an open state, after the first terminal searches the equipment identifier of the second terminal through the first Bluetooth assembly, the second terminal receives the request signal sent by the first terminal through the first Wi-Fi assembly through the second Wi-Fi assembly, searches the equipment identifier of the first terminal through the second Bluetooth assembly, and sends a feedback signal to the first terminal through the second Wi-Fi assembly according to the equipment identifier of the first terminal.
Illustratively, when the second near field communication component is a second NFC component, before the second terminal receives the request signal through the second NFC component, it is required to ensure that the NFC function is in an on state; the NFC function may be automatically turned on when the application is started, or turned on in a setting window of the application, or turned on in a setting window of the operating system of the second terminal. When the NFC function is in the on state, the second terminal, after receiving through the second NFC component the request signal sent by the first terminal through the first NFC component, sends a feedback signal to the first terminal through the second NFC component.
In step 204, the first terminal sends the unique identification information and the information of the target virtual object to the server.
And the first terminal acquires the unique identification information contained in the feedback signal after receiving the feedback signal through the first near field wireless communication assembly, and sends the unique identification information and the information of the target virtual object to the server through a wired or wireless network.
The first terminal sends the unique identification information and the information of the target virtual object to the server through, but not limited to, the following step combinations: steps 204a and 204b, steps 204a and 204c, or steps 204a and 204d:
and step 204a, after receiving the feedback signal, the first terminal sends the unique identification information to the server.
And after receiving the feedback signal, the first terminal determines to display the associated picture on the second terminal and sends unique identification information to the server, wherein the unique identification information is used for triggering the server to send information corresponding to the associated picture to the second terminal.
And step 204b, the first terminal sends the information of the target virtual object to the server at intervals of a first time interval.
The first terminal sends information of the target virtual object to the server at intervals of a first time interval, wherein the information of the target virtual object is used for assisting the server to generate information corresponding to the associated picture according to the information of the target virtual object.
After receiving the feedback signal, the first terminal sends information of the target virtual object to the server at intervals of a first time interval; or, the step of sending, by the first terminal, the information of the target virtual object to the server at every first time interval does not depend on other steps in the embodiment of the present application.
In step 204c, the first terminal sends the information of the target virtual object to the server based on the event trigger.
Optionally, the trigger event includes an event triggered by a change in the target position of the target virtual object. And when the first terminal receives the trigger event, sending the information of the target virtual object to the server.
After receiving the feedback signal, the first terminal sends information of the target virtual object to the server based on event triggering; or, the step of the first terminal sending the information of the target virtual object to the server based on the event trigger does not depend on other steps in the embodiment.
And step 204d, the first terminal sends the information of the target virtual object to the server based on the event trigger, and the first terminal sends the information of the target virtual object to the server at intervals of a first time interval.
The first terminal sends the information of the target virtual object to the server at intervals of a first time interval, and sends the information of the target virtual object to the server when a trigger event is received. Optionally, the trigger event includes an event triggered by a change in the target position of the target virtual object.
The first terminal may perform step 204d after performing step 204a; alternatively, step 204d does not depend on the other steps in this embodiment. The combined reporting pattern of step 204d is sketched below.
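A minimal, hypothetical sketch of that pattern: the terminal reports the target object's information both at every first time interval and whenever the target position changes. The get_info/send hooks, the interval value, and the bounded loop are invented for illustration.

```python
# Sketch of steps 204b-204d: report on a timer and on position-change
# events. Interval length and the hooks are assumptions.

import time

def report_loop(get_info, send, interval=1.0, ticks=5):
    last_pos = None
    deadline = time.monotonic() + interval
    for _ in range(ticks):                     # bounded loop for the sketch
        info = get_info()
        moved = info["position"] != last_pos   # event: target position changed
        timed_out = time.monotonic() >= deadline
        if moved or timed_out:
            send(info)
            last_pos = info["position"]
            deadline = time.monotonic() + interval
        time.sleep(0.2)

positions = iter([(0, 0), (0, 0), (1, 0), (1, 0), (2, 0)])
report_loop(lambda: {"position": next(positions)},
            send=lambda info: print("upload", info))
```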
In step 205, the server generates information corresponding to the associated screen based on the information of the target virtual object.
The associated screen includes at least one of a second environment screen, a third environment screen and a situation screen.
The second environment picture includes a picture of a second local area determined by the target position, where the area of the second local area is larger than that of the first local area. The third environment picture includes a picture of the first local area determined by the associated position of an associated virtual object in the virtual environment, where the associated virtual object is a virtual object in the same camp as the target virtual object. The situation picture is a picture that displays a map of the virtual environment carrying associated information, where the associated information is the information contained in the field of view of the target virtual object and/or of the associated virtual object.
The following describes a process in which the server acquires information corresponding to the related screen from information of the target virtual object, using a two-dimensional environment as an example. The virtual environment in the embodiment of the present application may be a two-dimensional environment, or may also be a 2.5-dimensional environment or a three-dimensional environment, and the following description is an exemplary description and does not limit the virtual environment.
Illustratively, when the associated screen includes a second environment screen, the information of the target virtual object includes a target position, the server determines a second view frame with the target position as a reference point, obtains a second local area through the second view frame in the virtual environment, and acquires the information of the virtual object included in the second local area as the information of the associated screen.
Illustratively, when the associated screen includes the third environment screen, the information of the target virtual object includes the account corresponding to the associated virtual object. The server obtains the associated position of the associated virtual object according to that account, determines the associated view frame with the associated position as the reference point, obtains the first local area through the associated view frame in the virtual environment, and acquires the information of the virtual objects included in the first local area as the information of the associated screen.
Illustratively, when the associated picture includes the situation picture, the information of the target virtual object includes the target position and the account corresponding to the associated virtual object. The server obtains the associated position of the associated virtual object according to that account, determines the second finder frame and the associated finder frame with the target position and the associated position as their respective reference points, obtains the second local area through the second finder frame in the virtual environment, obtains the first local area through the associated finder frame in the virtual environment, and acquires the information of the virtual objects included in the second local area and the first local area as the information of the associated picture.
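As a sketch of step 205 for the second-environment-screen case, the hypothetical helper below collects the virtual objects that fall inside a second finder frame centered on the target position. The 320 × 180 frame size is an invented example of a frame larger than the 160 × 90 first frame; the object representation is also an assumption.

```python
# Hypothetical sketch: the server frames the map with a larger second
# finder frame around the target position and collects the virtual
# objects inside it. All names and sizes are illustrative.

def objects_in_frame(objects, center, frame_w=320, frame_h=180):
    """objects: list of {'id': ..., 'position': (x, y)} dicts."""
    cx, cy = center
    left, top = cx - frame_w / 2, cy - frame_h / 2
    return [o for o in objects
            if left <= o["position"][0] <= left + frame_w
            and top <= o["position"][1] <= top + frame_h]

world = [{"id": "friend A", "position": (5100, 5020)},
         {"id": "enemy B", "position": (9000, 9000)}]
print(objects_in_frame(world, center=(5000, 5000)))  # only friend A
```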
And step 206, the server sends information corresponding to the associated picture to the second terminal according to the unique identification information.
And after receiving the unique identification information sent by the first terminal, the server determines the terminal needing to send the information corresponding to the associated picture as a second terminal according to the unique identification information, and sends the information corresponding to the associated picture to the second terminal.
And step 207, the second terminal generates a related picture according to the information corresponding to the related picture.
Illustratively, when the associated screen is a second environment screen, the second terminal generates the second environment screen according to the information of the virtual object contained in the second local area sent by the server. For example, the information of the virtual object included in the second local area includes the position and the identifier of the virtual object included in the second local area, and the second terminal frames the virtual environment through the second frame to obtain the second local area, and then generates the second environment screen according to the information.
Illustratively, when the associated screen is the third environment screen, the second terminal generates the third environment screen according to the information of the virtual objects contained in the first local area sent by the server. For example, the information of the virtual objects included in the first local area includes their positions and identifiers; the second terminal frames the virtual environment through the associated frame to obtain the first local area, and then generates the third environment screen according to this information.
Illustratively, when the associated screen is the situation screen, the second terminal generates the situation screen based on the information of the virtual objects included in the second local area and the first local area sent by the server. For example, the information of the virtual objects included in each local area includes their positions and identifiers, and the second terminal generates the situation screen on the map based on this information, where the map is a map of the virtual environment.
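A hypothetical sketch of the situation-screen case: the second terminal merges the object lists reported for the two framed regions into one identifier-to-position table to draw on the map. Deduplication by identifier, with the later report winning, is an assumption not stated in the text.

```python
# Sketch: merge the objects reported for both framed regions and keep
# one map position per identifier (later report wins; an assumption).

def build_situation_screen(target_region_objects, associated_region_objects):
    merged = {}
    for obj in target_region_objects + associated_region_objects:
        merged[obj["id"]] = obj["position"]   # id -> position to draw
    return merged

screen = build_situation_screen(
    [{"id": "hero", "position": (5000, 5000)}],
    [{"id": "ally", "position": (4200, 4600)}])
print(screen)  # {'hero': (5000, 5000), 'ally': (4200, 4600)}
```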
And 208, displaying a second user interface of the application program by the second terminal, wherein the second user interface displays an associated picture.
The second terminal displays the second user interface on its display screen, and the associated picture is displayed in the second user interface. For example, taking the associated picture as the second environment screen, as shown in fig. 4, a second user interface 125 is displayed on the display screen 121 of the second terminal 120, and the associated picture, here the second environment screen, is displayed in the second user interface 125. Since the area of the second local region displayed in the second environment screen is larger than that of the first local region displayed in the first environment screen, the second environment screen can display "friend A", who belongs to the same camp as the target virtual object, as well as "enemy A" and "enemy B", who do not belong to the same camp as the target virtual object, none of whom are displayed in the first environment screen.
To sum up, in the embodiment of the present application, after the first terminal and the second terminal determine, through respective short-range wireless communication components, that an association screen is displayed on the second terminal, the server sends information corresponding to the association screen to the second terminal, and the second terminal generates and displays the association screen according to the information corresponding to the association screen; because the second terminal near the first terminal displays the related picture related to the target virtual object, more information related to the target virtual object can be displayed through the second terminal, so that a user can obtain more information of the virtual environment, and the use convenience of the application program is improved.
In this embodiment of the application, the virtual environment may be a two-dimensional virtual environment, or may also be a 2.5-dimensional virtual environment or a three-dimensional virtual environment. The following embodiment of fig. 5 will describe a user interface display method proposed in the present application by taking a two-dimensional virtual environment as an example; the embodiment of fig. 7 will describe a user interface display method proposed in the present application by taking a three-dimensional virtual environment as an example.
Fig. 5 is a flowchart illustrating a user interface display method according to an exemplary embodiment of the present application. The method may be applied in a computer system 100 as shown in fig. 1. The method comprises the following steps:
step 501, a first terminal acquires a target position of a target virtual object in a virtual environment.
The first terminal is provided with an application program of a virtual environment, and the target virtual object is an object in the virtual environment controlled by the user through the first terminal.
Step 502, the first terminal frames the virtual environment through a first frame determined by taking the target position as a center, so as to obtain a first local area.
The first terminal takes the target position as the center point of the first view frame, determines the first view frame according to the size of the first view frame, and then views the virtual environment to obtain a first local area.
Illustratively, as shown in fig. 6, the first terminal frames the virtual environment 600 according to the size of the first frame 610 (the length and width of the first frame 610) with the target position 611 as the center, resulting in a first local area 620.
In step 503, the first terminal generates a first environment screen according to the first local area.
And the first terminal generates a first environment picture according to the acquired first local area.
Step 504, the first terminal displays a first user interface of the application program, and a first environment picture and a collaborative display control are displayed in the first user interface.
The first terminal is provided with an application program of a virtual environment, and when the first terminal runs the application program, a first user interface of the application program is displayed.
The step of displaying the first user interface by the first terminal may refer to step 201 in the embodiment of fig. 2, which is not described herein again.
And 505, when receiving the signal triggered on the cooperative display control, the first terminal sends a request signal through the first short-range wireless communication component.
The step of the first terminal sending the request signal to the second terminal through the first short-range wireless communication component when receiving the signal triggered on the cooperative display control may refer to step 202 in the embodiment of fig. 2, which is not described herein again.
In step 506, when the second terminal receives the request signal through the second short-range wireless communication component, the second terminal sends a feedback signal to the first terminal through the second short-range wireless communication component.
The feedback signal comprises unique identification information corresponding to the application program in the second terminal. The second terminal is provided with the same application program as the first terminal, or the application program installed in the second terminal and the application program running in the first terminal are the same type of application program on different operating system platforms.
The step of sending the feedback signal to the first terminal through the second short-range wireless communication component when the second terminal receives the request signal through the second short-range wireless communication component may refer to step 203 in the embodiment of fig. 2, which is not described herein again.
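For illustration only, the following Python sketch models the request and feedback signals of steps 505 to 506; the message fields and the transport are assumptions, since the embodiment only requires that the feedback carry the unique identification information of the application in the second terminal.

from dataclasses import dataclass

@dataclass
class RequestSignal:
    kind: str = "collab_display_request"  # asks the peer application to display the associated screen

@dataclass
class FeedbackSignal:
    unique_id: str  # unique identification information of the application in the second terminal

def on_request_received(app_unique_id: str) -> FeedbackSignal:
    # The second terminal answers the request with its unique identification
    # information, which the first terminal later forwards to the server (step 507).
    return FeedbackSignal(unique_id=app_unique_id)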
In step 507, the first terminal sends the unique identification information and the information of the target virtual object to the server, wherein the information of the target virtual object comprises a target position.
The step of sending the unique identification information and the information of the target virtual object to the server by the first terminal may refer to step 204 in the embodiment of fig. 2, which is not described herein again.
Step 508, the server frames the virtual environment through a second viewing frame determined by taking the target position as the center, to obtain a second local area.
The server takes the target position as the center point of the second viewing frame, determines the second viewing frame according to its size, and frames the virtual environment to obtain the second local area. The area of the second viewing frame is larger than that of the first viewing frame.
In step 509, the server obtains information of the virtual object included in the second local area.
The information of the virtual object contained in the second local area includes the position and the identifier of the virtual object.
In step 510, the server sends information of the virtual object included in the second local area to the second terminal according to the unique identification information.
After receiving the unique identification information sent by the first terminal, the server determines, according to the unique identification information, that the terminal to which the information needs to be sent is the second terminal, and therefore sends the information of the virtual object contained in the second local area to the second terminal.
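For illustration only, the following self-contained sketch condenses steps 508 to 510 on the server side; world, sessions, and send_to are hypothetical helpers standing in for the server's game state and its connection to the second terminal, and the frame size is an assumed value.

def handle_target_update(unique_id, target_x, target_y, world, sessions, send_to,
                         w=64.0, h=36.0):
    # Step 508: the larger second viewing frame, centered on the target position.
    left, bottom = target_x - w / 2, target_y - h / 2
    # Step 509: collect the position and identifier of every virtual object inside it.
    visible = [
        {"id": obj.kind_id, "x": obj.x, "y": obj.y}
        for obj in world.objects
        if left <= obj.x <= left + w and bottom <= obj.y <= bottom + h
    ]
    # Step 510: the unique identification information selects the terminal to notify.
    send_to(sessions[unique_id], {"target": (target_x, target_y), "objects": visible})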
Step 511, the second terminal frames the map through the second viewing frame determined by taking the target position as the center, to obtain a second local map.
The map here is the map of the virtual environment. The second terminal takes the target position as the center point of the second viewing frame, determines the second viewing frame according to its size, and frames the map to obtain the second local map. Since the target virtual object is located at the center of the second local area, the information of the virtual object contained in the second local area includes the target position.
In step 512, the second terminal generates a virtual object on the second local map according to the information of the virtual object included in the second local area.
Illustratively, the second terminal stores a map, an avatar for each type of virtual object, and a correspondence between the identifier and the avatar of each type of virtual object.
The second terminal queries the correspondence according to the identifier of each virtual object contained in the second local area to determine the avatar of the virtual object to be displayed, and determines the display position according to the position of the virtual object contained in the second local area, thereby generating the virtual object on the second local map.
In step 513, the second terminal generates a second environment screen according to the second local map and the virtual object in the second local map.
The second terminal generates the second environment screen according to the acquired second local map and the virtual objects generated on the second local map.
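For illustration only, the following sketch condenses steps 511 to 513 on the second terminal; the identifier-to-avatar table and the crop and draw helpers are assumptions standing in for the locally stored map and rendering routines.

AVATARS = {1: "infantry.png", 2: "vehicle.png"}  # identifier -> stored avatar (assumed)

def build_second_environment_screen(payload, full_map, crop, draw):
    tx, ty = payload["target"]
    # Step 511: crop the stored map with the second viewing frame centered on the target.
    local_map = crop(full_map, center=(tx, ty), size=(64.0, 36.0))
    # Step 512: place each reported virtual object on the local map by identifier lookup.
    for obj in payload["objects"]:
        draw(local_map, AVATARS[obj["id"]], at=(obj["x"], obj["y"]))
    # Step 513: the annotated local map is the second environment screen.
    return local_map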
As shown in fig. 6, the first user interface of the first terminal 110 displays the first local area selected in the virtual environment 600 through the first viewing frame 610; the second user interface of the second terminal 120 displays the second local area selected in the virtual environment 600 through the second viewing frame 620 centered on the target position of the target virtual object 630. Since the area of the first viewing frame 610 is smaller than that of the second viewing frame 620, the target virtual object 630 is displayed at a smaller scale in the second terminal 120 than in the first terminal 110.
Step 514, the second terminal displays a second user interface of the application program, and the second environment screen is displayed in the second user interface.
The second terminal displays the second user interface on its display screen, and the generated second environment screen is displayed in the second user interface. Illustratively, as shown in fig. 4, a second user interface 125 is displayed on the display screen 121 of the second terminal 120, and a second environment screen is displayed in the second user interface 125.
To sum up, in this embodiment of the present application, after the first terminal and the second terminal negotiate through their respective short-range wireless communication components that an associated screen is to be displayed on the second terminal, the server sends the information corresponding to the associated screen to the second terminal, and the second terminal generates and displays the associated screen according to that information. Because the second terminal near the first terminal displays the associated screen related to the target virtual object, more information about the target virtual object can be presented through the second terminal, so that the user obtains more information about the virtual environment, and the convenience of using the application program is improved.
Fig. 7 is a flowchart illustrating a user interface display method according to an exemplary embodiment of the present application. The method may be applied in a computer system 100 as shown in fig. 1. The method comprises the following steps:
Step 701, the first terminal acquires a target position of a target virtual object in a virtual environment.
An application program supporting a virtual environment is installed in the first terminal; the target virtual object is an object in the virtual environment controlled by the user through the first terminal, and the target position may be target coordinates of the target virtual object in the virtual environment.
Step 702, the first terminal determines a first observation position of the first observation point in the virtual environment according to the target position.
The first observation position of the first observation point in the virtual environment is related to the target position, in the virtual environment, of the target virtual object corresponding to the first terminal. Illustratively, the target coordinates of the target virtual object in the virtual environment are (x1, y1, z1), the first observation point is a point arranged diagonally behind the target virtual object, and the coordinate difference between the first observation point and the target virtual object is (0, -Δy1, Δz1). The first terminal calculates the first coordinates of the first observation point as (x1, y1-Δy1, z1+Δz1) according to the coordinates of the target virtual object, thereby determining the first observation position of the first observation point. The target coordinates of a virtual object in the virtual environment may be the coordinates of a reference point of the virtual object, where the reference point may be a pixel at a preset position of the virtual object, such as an eye, the top of the head, or the left shoulder.
In step 703, the first terminal observes the virtual environment at a first observation position through a first viewing angle to obtain a first local area.
The first observation point corresponds to a first viewing angle, and the first viewing angle has a direction and a size. Illustratively, as shown in fig. 8, the first observation point P1 is located diagonally behind the target virtual object 800, and the first terminal simulates, in the virtual environment, observing the virtual environment through the first viewing angle α1 at the first observation position where the first observation point P1 is located, to obtain the first local area.
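For illustration only, the following sketch shows the coordinate arithmetic of steps 702 to 703, assuming a coordinate system in which the offsets dy1 and dz1 place the first observation point diagonally behind and above the target; the forward direction and the angle test are simplifications of a real view frustum, not details from the embodiment.

import math

def first_observation_position(target, dy1=2.0, dz1=3.0):
    x1, y1, z1 = target
    return (x1, y1 - dy1, z1 + dz1)  # (x1, y1-Δy1, z1+Δz1)

def in_first_view(obs, obj, half_angle_deg=30.0):
    # Crude visibility test: the angle between the camera's forward direction
    # and the direction toward the object must fall within the first viewing angle α1.
    d = tuple(b - a for a, b in zip(obs, obj))
    forward = (0.0, 1.0, -1.0)  # toward the target from diagonally behind and above (assumed)
    dot = sum(p * q for p, q in zip(d, forward))
    norm = math.hypot(*d) * math.hypot(*forward)
    cos_angle = max(-1.0, min(1.0, dot / norm)) if norm else 1.0
    return math.degrees(math.acos(cos_angle)) <= half_angle_deg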
In step 704, the first terminal generates a first environment screen according to the first local area.
The first terminal generates the first environment screen according to the acquired first local area.
Step 705, the first terminal displays a first user interface of the application program, and a first environment screen of the virtual environment and a collaborative display control are displayed in the first user interface.
An application program supporting a virtual environment is installed in the first terminal; when the first terminal runs the application program, the first user interface is displayed, and the first environment screen and the collaborative display control are displayed in the first user interface. The collaborative display control is used for prompting the user that a second environment screen can be displayed on the display screen of another terminal.
Illustratively, as shown in fig. 9, a first user interface 112 is displayed in the display screen 111 of the first terminal 110; a first environment screen and a collaborative display control 113 are displayed in the first user interface 112; and the first environment screen shows the target virtual object 114 controlled by the user through the first terminal 110, together with the first local area of the virtual environment observed from the first observation point corresponding to the target virtual object 114.
In step 706, when receiving the signal triggered on the collaborative display control, the first terminal sends a request signal through the first short-range wireless communication component.
The step of the first terminal sending the request signal to the second terminal through the first short-range wireless communication component when receiving the signal triggered on the collaborative display control may refer to step 202 in the embodiment of fig. 2, and is not described herein again.
Step 707, when the second terminal receives the request signal through the second short-range wireless communication component, the second terminal sends a feedback signal to the first terminal through the second short-range wireless communication component.
The feedback signal comprises unique identification information corresponding to the application program in the second terminal. The step of sending the feedback signal to the first terminal through the second short-range wireless communication component when the second terminal receives the request signal through the second short-range wireless communication component may refer to step 203 in the embodiment of fig. 2, which is not described herein again.
In step 708, the first terminal sends the unique identification information and information of the target virtual object to the server, the information of the target virtual object including the target location.
The step of sending the unique identification information and the information of the target virtual object to the server by the first terminal may refer to step 204 in the embodiment of fig. 2, which is not described herein again.
In step 709, the server determines a second observation position of the second observation point in the virtual environment according to the target position.
The second observation position of the second observation point in the virtual environment is related to the target position of the target virtual object. Illustratively, the target coordinates of the target virtual object in the virtual environment are (x1, y1, z1), the second observation point is arranged diagonally behind the target virtual object, and the coordinate difference between the second observation point and the target virtual object is (0, -Δy2, Δz2); the server therefore calculates the second coordinates of the second observation point as (x1, y1-Δy2, z1+Δz2) according to the coordinates of the target virtual object, thereby determining the second observation position of the second observation point. The target coordinates of a virtual object in the virtual environment may be the coordinates of a reference point of the virtual object, where the reference point may be a pixel at a preset position of the virtual object, such as an eye, the top of the head, or the left shoulder.
Step 710, the server observes the virtual environment at a second observation position through a second viewing angle to obtain a second local area.
The second observation point corresponds to a second viewing angle, and the second viewing angle has a direction and a size. Illustratively, as shown in fig. 8, the second observation point P2 is located diagonally behind the target virtual object 800, and the server simulates, in the virtual environment, observing the virtual environment through the second viewing angle α2 at the second observation position where the second observation point P2 is located, to obtain the second local area.
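The second observation point follows the same pattern with larger offsets and a wider angle, which is why the second local area covers more of the virtual environment; the concrete values dy2 > dy1 and α2 > α1 in the sketch below are assumptions consistent with the text, not prescribed by it.

def second_observation_position(target, dy2=6.0, dz2=9.0):
    x1, y1, z1 = target
    return (x1, y1 - dy2, z1 + dz2)  # (x1, y1-Δy2, z1+Δz2)

# The earlier angle test can be reused with the wider α2, for example:
# in_first_view(second_observation_position(t), obj, half_angle_deg=45.0)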
In step 711, the server obtains information of the virtual object included in the second local area.
The information of the virtual object contained in the second local area includes the position and the identifier of the virtual object.
In step 712, the server transmits information of the virtual object included in the second local area to the second terminal based on the unique identification information.
After receiving the unique identification information sent by the first terminal, the server determines, according to the unique identification information, that the terminal to which the information needs to be sent is the second terminal, and therefore sends the information of the virtual object contained in the second local area to the second terminal.
Step 713, the second terminal determines a second observation position of a second observation point corresponding to the target virtual object in the background space of the virtual environment according to the target position.
Since the target virtual object is located at the center of the second local area, the information of the virtual object contained within the second local area includes the target position. The background space of the virtual environment is a virtual environment that does not include virtual objects. For example, the virtual environment includes islands, oceans, and virtual objects that move on the islands, and the background space is the islands and oceans in the virtual environment.
The second terminal determines, according to the target position, the second observation position of the second observation point corresponding to the target virtual object in the background space of the virtual environment. Illustratively, the second terminal calculates the second coordinates of the second observation position according to the target coordinates and the coordinate difference between the second observation point and the target virtual object, and observes the background space of the virtual environment through the second viewing angle α2 at the second observation position indicated by the second coordinates, so as to determine the local background space.
Step 714, the second terminal observes the background space according to the second observation position and the second viewing angle of the second observation point, to obtain a local background space.
Wherein the local background space is a second local area that does not include virtual objects.
In step 715, the second terminal generates a virtual object in the local background space according to the information of the virtual object included in the second local area.
Illustratively, the background space of the virtual environment, the model of each type of virtual object, and the corresponding relationship between the identifier of each type of virtual object and the model are stored in the second terminal.
The second terminal queries the correspondence according to the identifier of each virtual object contained in the second local area to determine the model of the virtual object to be displayed, and determines the display position according to the coordinates of the virtual object contained in the second local area, thereby generating the virtual object in the local background space.
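For illustration only, the following self-contained sketch condenses steps 713 to 715; scene.view_from and place are hypothetical rendering helpers, and the identifier-to-model table is an assumed stand-in for the correspondence stored in the second terminal.

MODELS = {1: "soldier.obj", 2: "jeep.obj"}  # identifier -> stored 3D model (assumed)

def build_second_environment_screen_3d(payload, scene, place, dy2=6.0, dz2=9.0):
    x1, y1, z1 = payload["target"]
    # Step 713: second observation position (x1, y1-Δy2, z1+Δz2), diagonally behind and above.
    obs = (x1, y1 - dy2, z1 + dz2)
    # Step 714: observe the background space (no virtual objects) to get the local background.
    local_bg = scene.view_from(obs, fov_deg=90.0)
    # Step 715: instantiate each reported virtual object at its coordinates by model lookup.
    for obj in payload["objects"]:
        place(local_bg, MODELS[obj["id"]], at=obj["pos"])
    return local_bg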
Step 716, the second terminal generates a second environment picture according to the local background space and the virtual object in the local background space.
The second terminal generates the second environment screen according to the obtained local background space and the virtual objects generated in the local background space.
Step 717, the second terminal displays a second user interface of the application program, and the second environment screen is displayed in the second user interface.
Illustratively, as shown in fig. 9, a first user interface 112 is displayed on the display screen 111 of the first terminal 110; a first environment screen is displayed in the first user interface 112, showing the target virtual object 114 and the first local area of the virtual environment observed from the first observation point corresponding to the target virtual object 114. A second user interface 122 is displayed on the display screen 121 of the second terminal 120; a second environment screen is displayed in the second user interface 122, showing the target virtual object 114 and the second local area of the virtual environment observed from the second observation point corresponding to the target virtual object 114.
To sum up, in this embodiment of the present application, after the first terminal and the second terminal negotiate through their respective short-range wireless communication components that an associated screen is to be displayed on the second terminal, the server sends the information corresponding to the associated screen to the second terminal, and the second terminal generates and displays the associated screen according to that information. Because the second terminal near the first terminal displays the associated screen related to the target virtual object, more information about the target virtual object can be presented through the second terminal, so that the user obtains more information about the virtual environment, and the convenience of using the application program is improved.
Fig. 10 is a block diagram illustrating a user interface display apparatus provided in an exemplary embodiment of the present application. The apparatus may be implemented as the first terminal 110 in the embodiment of fig. 1 by software, hardware or a combination of both. The device includes: a display module 1010, a first short range wireless communication module 1020, and a transmission module 1030.
The display module 1010 is configured to display a first user interface of an application, where the first user interface displays a first environment screen and a collaborative display control, and the first environment screen includes a screen of a first local area determined by a target position of a target virtual object in a virtual environment.
A first short-range wireless communication module 1020, configured to send a request signal through the first short-range wireless communication component when receiving a signal triggered on the collaborative display control, where the request signal is used to request the application in the second terminal to display an associated screen, and the associated screen is a screen displayed in association with the target virtual object; and to receive, through the first short-range wireless communication component, a feedback signal sent by the second terminal through the second short-range wireless communication component, where the feedback signal contains unique identification information corresponding to the application in the second terminal.
A sending module 1030, configured to send the unique identification information and the information of the target virtual object to the server, where the unique identification information is used to trigger the server to send the information corresponding to the associated screen to the second terminal, and the information of the target virtual object is used for assisting the server in acquiring the information corresponding to the associated screen according to the information of the target virtual object.
In an optional embodiment, the sending module 1030 is further configured to send the unique identification information to the server after receiving the feedback signal, and to send the information of the target virtual object to the server at preset time intervals and/or based on event triggering.
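For illustration only, the following sketch shows one way to realize this sending policy: the unique identification information is sent once after the feedback signal, and the information of the target virtual object is then pushed at a preset interval and on demand; the threading approach and the server interface are assumptions, not details from the embodiment.

import threading

def start_reporting(server, unique_id, get_target_info, interval_s=0.5):
    server.send(("unique_id", unique_id))  # sent once, after the feedback signal

    def tick():
        server.send(("target_info", get_target_info()))
        threading.Timer(interval_s, tick).start()  # preset time interval

    tick()
    # Event-triggered path, e.g. bound to a position-changed callback:
    return lambda: server.send(("target_info", get_target_info()))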
In an alternative embodiment, the first short-range wireless communication component comprises a first Bluetooth component and the second short-range wireless communication component comprises a second Bluetooth component;
the first short-range wireless communication module 1020 is further configured to search through the first Bluetooth component to obtain the device identifier of the second terminal; send a Bluetooth pairing request to the second terminal through the first Bluetooth component according to the device identifier; and, after receiving a pairing acceptance signal sent by the second terminal through the second Bluetooth component, send the request signal to the second terminal through the first Bluetooth component.
In an alternative embodiment, the first short-range wireless communication component includes a first Wi-Fi component and a first Bluetooth component, and the second short-range wireless communication component includes a second Wi-Fi component and a second Bluetooth component;
the first short-range wireless communication module 1020 is further configured to search through the first Bluetooth component to obtain the device identifier of the second terminal; and, after the device identifier of the second terminal is found, send the request signal to the second terminal through the first Wi-Fi component, where the request signal is used for assisting the second terminal in receiving the request signal through the second Wi-Fi component.
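For illustration only, the following sketch contrasts the two discovery variants above; the bt and wifi objects are hypothetical wrappers around the short-range wireless communication components, not a real platform API.

def send_request(bt, wifi=None, request=b"collab_display_request"):
    device_id = bt.scan_for_peer()  # search to obtain the device identifier of the second terminal
    if wifi is None:
        bt.pair(device_id)               # Bluetooth pairing request
        bt.wait_for("pairing_accepted")  # pairing acceptance signal from the second terminal
        bt.send(device_id, request)      # request signal over Bluetooth
    else:
        wifi.send(device_id, request)    # request signal over Wi-Fi after Bluetooth discovery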
In an optional embodiment, the associated screen includes at least one of a second environment screen, a third environment screen, and a situation screen; the second environment screen includes a screen of a second local area determined by the target position, and the area of the second local area is larger than that of the first local area; the third environment screen includes a screen of a first local area determined by the associated position of an associated virtual object in the virtual environment, the associated virtual object being an object in the same camp as the target virtual object; the situation screen is a screen on which a map carrying associated information is displayed, where the associated information is information contained in the visual field range of the target virtual object and/or the associated virtual object, and the map is a map of the virtual environment.
In an alternative embodiment, the virtual environment is a two-dimensional environment, and the associated screen comprises the second environment screen; the first local area is an area obtained by framing the virtual environment through a first viewing frame determined by taking the target position as the center; the second local area is an area obtained by framing the virtual environment through a second viewing frame determined by taking the target position as the center; and the area of the second viewing frame is larger than that of the first viewing frame.
In an alternative embodiment, the information of the target virtual object includes the target position; and the information corresponding to the associated screen includes information of the virtual object contained in the second local area obtained after the server frames the virtual environment through the second viewing frame.
In an alternative embodiment, the virtual environment is a 2.5-dimensional environment or a three-dimensional environment, and the associated screen comprises a second environment screen; the first local area is an area obtained by observing the virtual environment with a first observation point corresponding to the target virtual object; the second local area is an area obtained by observing the virtual environment through a second observation point corresponding to the target virtual object; wherein the visual field range of the second observation point is larger than that of the first observation point.
In an alternative embodiment, the information of the target virtual object includes the target position; the information corresponding to the associated screen includes a second observation position of the second observation point determined by the server according to the target position, and information obtained after the server observes the virtual environment according to the second observation position and a second viewing angle to obtain the second local area; and the height of the second observation position is higher than that of the first observation position, and/or the second viewing angle is greater than the first viewing angle.
Fig. 11 is a block diagram illustrating a user interface display apparatus provided in an exemplary embodiment of the present application. The apparatus may be implemented as the second terminal 120 in the embodiment of fig. 1 by software, hardware or a combination of both. The device includes: a second short-range wireless communication module 1110, a receiving module 1120, a processing module 1130, and a display module 1140.
A second short-range wireless communication module 1110, configured to receive, through the second short-range wireless communication component, a request signal sent by the first terminal through the first short-range wireless communication component; and to send a feedback signal to the first terminal through the second short-range wireless communication component, where the feedback signal carries unique identification information of an application program in the second terminal, and the unique identification information is used for triggering the server to send information corresponding to the associated screen.
The receiving module 1120 is configured to receive information corresponding to the associated screen sent by the server.
The processing module 1130 is configured to generate the associated screen according to the information corresponding to the associated screen.
The display module 1140 is configured to display a second user interface of the application, where the second user interface displays an associated screen.
In an alternative embodiment, the display module 1140 is further configured to display a request window according to the request signal, where the request window is used to request to display the associated screen.
The second short-range wireless communication module 1110 is further configured to send a feedback signal to the first terminal through the second short-range wireless communication component when receiving the permission signal triggered on the request window.
In an optional embodiment, the associated screen includes a second environment screen, the second environment screen includes a screen of a second local area determined by a target position of the target virtual object in the virtual environment, and the information corresponding to the associated screen includes information of the virtual object included in the second local area; the virtual environment is a two-dimensional environment;
the processing module 1130 is further configured to frame the map through the second viewing frame determined by taking the target position as the center to obtain a second local map, where the map is a map of the virtual environment and the second local map is a map of the second local area; generate virtual objects on the second local map according to the information of the virtual objects contained in the second local area; and generate the associated screen according to the second local map and the virtual objects on the second local map.
In an optional embodiment, the associated screen includes a second environment screen, the second environment screen includes a screen of a second local area determined by a target position of the target virtual object in the virtual environment, and the information corresponding to the associated screen includes information of the virtual object included in the second local area; the virtual environment is a 2.5-dimensional environment or a three-dimensional environment;
the processing module 1130 is further configured to determine, according to the target position, a second observation position of a second observation point corresponding to the target virtual object in the background space of the virtual environment; observe the background space according to the second observation position and a second viewing angle of the second observation point to obtain a local background space, where the local background space is the background of the second local area; generate virtual objects in the local background space according to the information of the virtual objects contained in the second local area; and generate the associated screen according to the local background space and the virtual objects in the local background space.
Fig. 12 is a block diagram illustrating a user interface display apparatus provided in an exemplary embodiment of the present application. The apparatus may be implemented as the server 130 in the embodiment of fig. 1 by software, hardware or a combination of both. The device includes: a receiving module 1210, a processing module 1220, and a sending module 1230.
A receiving module 1210, configured to receive information of a target virtual object and unique identification information sent by a first terminal; the target virtual object is a virtual object in an application program running in the first terminal, and the unique identification information is identification information corresponding to the application program in the second terminal.
The processing module 1220 is configured to obtain information corresponding to an associated screen according to the information of the target virtual object, where the associated screen is a screen displayed in association with the target virtual object.
A sending module 1230, configured to send information corresponding to the associated screen to an application program in the second terminal according to the unique identification information, where the information corresponding to the associated screen is used to assist the second terminal in generating the associated screen.
In an optional embodiment, the association screen includes a second environment screen, the second environment screen includes a screen of a second local area determined by a target position of the target virtual object in the virtual environment, the information corresponding to the association screen includes information of the virtual object included in the second local area, and the information of the target virtual object includes the target position of the target virtual object in the virtual environment; the virtual environment is a two-dimensional environment;
the processing module 1220 is further configured to frame the virtual environment through the second viewing frame determined by taking the target position as the center to obtain the second local area, and acquire information of the virtual object contained in the second local area.
In an optional embodiment, the association screen includes a second environment screen, the second environment screen includes a screen of a second local area determined by a target position of the target virtual object in the virtual environment, the information corresponding to the association screen includes information of the virtual object included in the second local area, and the information of the target virtual object includes the target position of the target virtual object in the virtual environment; the virtual environment is a 2.5-dimensional environment or a three-dimensional environment;
the processing module 1220 is further configured to determine, according to the target position, a second observation position of a second observation point corresponding to the target virtual object; observe the virtual environment according to the second observation position and a second viewing angle to obtain the second local area; and acquire information of the virtual object contained in the second local area.
Fig. 13 shows a block diagram of a terminal 1300 according to an exemplary embodiment of the present application. The terminal 1300 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, or an MP4 (Moving Picture Experts Group Audio Layer IV) player. The terminal 1300 may also be referred to by other names such as user equipment or portable terminal.
In general, terminal 1300 includes: a processor 1301 and a memory 1302.
Processor 1301 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1301 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1301 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also referred to as a Central Processing Unit (CPU), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1301 may further include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The memory 1302 may include one or more computer-readable storage media, which may be tangible and non-transitory. The memory 1302 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1302 is used to store at least one instruction for execution by processor 1301 to implement the user interface display method provided herein for execution by a first terminal or a second terminal.
In some embodiments, terminal 1300 may further optionally include: a peripheral interface 1303 and at least one peripheral. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1304, touch display 1305, camera 1306, audio circuitry 1307, positioning component 1308, and power supply 1309.
Peripheral interface 1303 may be used to connect at least one peripheral associated with I/O (Input/Output) to processor 1301 and memory 1302. In some embodiments, processor 1301, memory 1302, and peripheral interface 1303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1301, the memory 1302, and the peripheral device interface 1303 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 1304 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1304 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1304 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1304 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 1304 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or Wi-Fi (Wireless Fidelity) networks. In this embodiment, the radio frequency circuit 1304 may further include at least one of a Wi-Fi component, a bluetooth component, and an NFC component.
The touch display 1305 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. The touch display 1305 also has the capability to collect touch signals on or over the surface of the touch display 1305. The touch signal may be input to the processor 1301 as a control signal for processing. The touch display 1305 is used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, touch display 1305 may be one, providing the front panel of terminal 1300; in other embodiments, touch display 1305 may be at least two, either on different surfaces of terminal 1300 or in a folded design; in still other embodiments, touch display 1305 may be a flexible display disposed on a curved surface or on a folded surface of terminal 1300. Even more, the touch screen 1305 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The touch Display 1305 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or the like.
The camera assembly 1306 is used to capture images or video. Optionally, camera assembly 1306 includes a front camera and a rear camera. Generally, a front camera is used for realizing video call or self-shooting, and a rear camera is used for realizing shooting of pictures or videos. In some embodiments, the number of the rear cameras is at least two, and each of the rear cameras is any one of a main camera, a depth-of-field camera and a wide-angle camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize a panoramic shooting function and a VR (Virtual Reality) shooting function. In some embodiments, camera assembly 1306 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1307 is used to provide an audio interface between the user and the terminal 1300. The audio circuit 1307 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1301 for processing, or inputting the electric signals to the radio frequency circuit 1304 for realizing voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of terminal 1300. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1301 or the radio frequency circuitry 1304 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 1307 may also include a headphone jack.
The positioning component 1308 is used for positioning the current geographic location of the terminal 1300 to implement navigation or LBS (Location Based Service). The positioning component 1308 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
Power supply 1309 is used to provide power to various components in terminal 1300. The power source 1309 may be alternating current, direct current, disposable or rechargeable. When the power source 1309 comprises a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1300 also includes one or more sensors 1310. The one or more sensors 1310 include, but are not limited to: acceleration sensor 1311, gyro sensor 1312, pressure sensor 1313, fingerprint sensor 1314, optical sensor 1315, and proximity sensor 1316.
The acceleration sensor 1311 can detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the terminal 1300. For example, the acceleration sensor 1311 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1301 may control the touch display screen 1305 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1311. The acceleration sensor 1311 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1312 may detect the body direction and the rotation angle of the terminal 1300, and the gyro sensor 1312 may cooperate with the acceleration sensor 1311 to acquire a 3D motion of the user with respect to the terminal 1300. Processor 1301, based on the data collected by gyroscope sensor 1312, may perform the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensor 1313 may be disposed on a side bezel of terminal 1300 and/or underlying touch display 1305. When the pressure sensor 1313 is provided on the side frame of the terminal 1300, a user's grip signal on the terminal 1300 can be detected, and left-right hand recognition or shortcut operation can be performed based on the grip signal. When the pressure sensor 1313 is disposed on the lower layer of the touch display 1305, it is possible to control an operability control on the UI interface according to a pressure operation of the user on the touch display 1305. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1314 is used for collecting the fingerprint of the user to identify the identity of the user according to the collected fingerprint. When the identity of the user is identified as a trusted identity, the processor 1301 authorizes the user to perform relevant sensitive operations, including unlocking a screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 1314 may be disposed on the front, back, or side of the terminal 1300. When a physical button or vendor Logo is provided on the terminal 1300, the fingerprint sensor 1314 may be integrated with the physical button or vendor Logo.
The optical sensor 1315 is used to collect the ambient light intensity. In one embodiment, the processor 1301 can control the display brightness of the touch display screen 1305 according to the intensity of the ambient light collected by the optical sensor 1315. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1305 is increased; when the ambient light intensity is low, the display brightness of the touch display 1305 is turned down. In another embodiment, the processor 1301 can also dynamically adjust the shooting parameters of the camera assembly 1306 according to the ambient light intensity collected by the optical sensor 1315.
The proximity sensor 1316, also known as a distance sensor, is typically disposed on the front face of the terminal 1300. The proximity sensor 1316 is used to collect the distance between the user and the front face of the terminal 1300. In one embodiment, when the proximity sensor 1316 detects that the distance between the user and the front face of the terminal 1300 gradually decreases, the processor 1301 controls the touch display 1305 to switch from the bright screen state to the dark screen state; when the proximity sensor 1316 detects that the distance between the user and the front face of the terminal 1300 gradually increases, the processor 1301 controls the touch display 1305 to switch from the dark screen state to the bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 13 is not intended to be limiting with respect to terminal 1300 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be employed.
Fig. 14 is a schematic structural diagram of a computer device according to an exemplary embodiment of the present application. The computer device may be the server 130 in the embodiment of fig. 1. Specifically, the computer device 1400 includes a central processing unit (CPU) 1401, a system memory 1404 including a random access memory (RAM) 1402 and a read-only memory (ROM) 1403, and a system bus 1405 connecting the system memory 1404 and the central processing unit 1401. The computer device 1400 also includes a basic input/output system (I/O system) 1406 that facilitates transfer of information between various components within the computer, and a mass storage device 1407 for storing an operating system 1413, application programs 1412 and other program modules 1415.
The basic input/output system 1406 includes a display 1408 for displaying information and an input device 1409, such as a mouse, keyboard, etc., for user input of information. Wherein the display 1408 and input device 1409 are both connected to the central processing unit 1401 via an input-output controller 1410 connected to the system bus 1405. The basic input/output system 1406 may also include an input/output controller 1410 for receiving and processing input from a number of other devices, such as a keyboard, mouse, or electronic stylus. Similarly, input-output controller 1410 also provides output to a display screen, a printer, or other type of output device.
The mass storage device 1407 is connected to the central processing unit 1401 through a mass storage controller (not shown) connected to the system bus 1405. The mass storage device 1407 and its associated computer-readable storage media provide non-volatile storage for the computer device 1400. That is, the mass storage device 1407 may include a computer-readable storage medium (not shown) such as a hard disk or a CD-ROM drive.
Without loss of generality, the computer-readable storage media may include computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will appreciate that the computer storage media is not limited to the foregoing. The system memory 1404 and mass storage device 1407 described above may collectively be referred to as memory.
The memory stores one or more programs configured to be executed by the one or more central processing units 1401, and the one or more programs contain instructions for implementing the user interface display method described above; the central processing unit 1401 executes the one or more programs to implement the user interface display method provided by the foregoing method embodiments.
According to various embodiments of the present application, the computer device 1400 may also run by being connected, through a network such as the Internet, to a remote computer on the network. That is, the computer device 1400 may be connected to the network 1412 through the network interface unit 1411 connected to the system bus 1405, or may be connected to another type of network or a remote computer system (not shown) by using the network interface unit 1411.
The memory also includes one or more programs, which are stored in the memory, the one or more programs including instructions for performing the steps performed by the server in the user interface display method provided by the embodiments of the present application.
The present application also provides a computer-readable storage medium, in which at least one instruction, at least one program, a code set, or an instruction set is stored, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the user interface display method according to any one of the above embodiments.
The present application also provides a computer program product, which when running on a computer, causes the computer to execute the user interface display method provided by the above method embodiments.
It should be understood that reference to "a plurality" herein means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (26)

1. A user interface display method, performed by a first terminal, the method comprising:
displaying a first user interface of an application program, wherein a first environment picture and a collaborative display control are displayed in the first user interface, and the first environment picture comprises a picture of a first local area determined by a target position of a target virtual object in a virtual environment;
when a signal triggered on the collaborative display control is received, sending a request signal through a first short-distance wireless communication component, wherein the request signal is used for requesting the application program in a second terminal to display an associated picture, and the associated picture is a picture displayed in association with the target virtual object;
receiving a feedback signal sent by a second terminal through a second short-distance wireless communication component through the first short-distance wireless communication component, wherein the feedback signal comprises unique identification information corresponding to the application program in the second terminal;
sending the unique identification information and the information of the target virtual object to a server, wherein the unique identification information is used for triggering the server to send the information corresponding to the associated picture to the second terminal; and the information of the target virtual object is used for assisting the server to acquire the information corresponding to the associated picture according to the information of the target virtual object.
2. The method of claim 1, wherein sending the unique identification information and the information of the target virtual object to a server comprises:
after receiving the feedback signal, sending the unique identification information to the server;
and sending the information of the target virtual object to the server at preset time intervals and/or sending the information of the target virtual object to the server based on event triggering.
3. The method of claim 1, wherein the first short-range wireless communication component comprises a first Bluetooth component and the second short-range wireless communication component comprises a second Bluetooth component;
the transmitting of the request signal by the first short-range wireless communication component includes:
searching through the first Bluetooth component to obtain the device identifier of the second terminal;
sending a Bluetooth pairing request to the second terminal through the first Bluetooth component according to the device identifier;
and after receiving a pairing acceptance signal sent by the second terminal through the second Bluetooth component, sending the request signal to the second terminal through the first Bluetooth component.
4. The method of claim 2, wherein the first short-range wireless communication component comprises a first Wi-Fi component and a first Bluetooth component, and wherein the second short-range wireless communication component comprises a second Wi-Fi component and a second Bluetooth component;
the transmitting of the request signal by the first short-range wireless communication component includes:
searching through the first Bluetooth component to obtain the device identifier of the second terminal;
and after the device identifier of the second terminal is found, sending the request signal to the second terminal through the first Wi-Fi component, wherein the request signal is used for assisting the second terminal in receiving the request signal through the second Wi-Fi component.
5. The method according to any one of claims 1 to 4, wherein the associated picture comprises at least one of a second environment picture, a third environment picture and a situation picture;
the second environment picture comprises a picture of a second local area determined by the target position, and the area of the second local area is larger than that of the first local area;
the third environment picture comprises a picture of a local area determined by an associated position of an associated virtual object in the virtual environment, the associated virtual object being an object in the same camp as the target virtual object;
and the situation picture is a picture displaying a map carrying associated information, the associated information being information contained within the field of view of the target virtual object and/or the associated virtual object, and the map being a map of the virtual environment.
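The situation picture of claim 5 can be read as a fog-of-war overlay: the map only carries information that falls within the field of view of the target virtual object or of an associated virtual object. A Kotlin sketch of that visibility test, with all names invented and allies assumed to be pre-filtered to the target's camp:

    import kotlin.math.hypot

    data class Unit2D(val x: Float, val y: Float, val viewRadius: Float)

    // A point of interest belongs to the "associated information" only if some
    // same-camp unit (the target or an associated virtual object) can see it.
    fun visibleOnSituationMap(px: Float, py: Float,
                              target: Unit2D, allies: List<Unit2D>): Boolean =
        (allies + target).any { hypot(px - it.x, py - it.y) <= it.viewRadius }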
6. The method of claim 5, wherein the virtual environment is a two-dimensional environment, and the associated picture comprises the second environment picture;
the first local area is an area obtained by framing the virtual environment with a first viewfinder frame centered on the target position;
the second local area is an area obtained by framing the virtual environment with a second viewfinder frame centered on the target position;
and the area of the second viewfinder frame is larger than that of the first viewfinder frame.
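For the two-dimensional case of claim 6, both local areas are rectangles centered on the target position, the second strictly larger than the first. A Kotlin sketch with arbitrary illustrative sizes:

    data class Frame(val left: Float, val top: Float, val right: Float, val bottom: Float) {
        val area get() = (right - left) * (bottom - top)
    }

    // A viewfinder frame of the given size, centered on the target position.
    fun viewfinder(cx: Float, cy: Float, w: Float, h: Float) =
        Frame(cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)

    fun main() {
        val first = viewfinder(100f, 100f, 32f, 18f)  // first local area (first terminal)
        val second = viewfinder(100f, 100f, 64f, 36f) // second local area (associated picture)
        check(second.area > first.area)               // the relation required by claim 6
    }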
7. The method of claim 6, wherein the information of the target virtual object comprises the target position;
and the information corresponding to the associated picture comprises information of virtual objects contained in the second local area, the second local area being obtained after the server frames the virtual environment with the second viewfinder frame.
8. The method of claim 5, wherein the virtual environment is a 2.5-dimensional environment or a three-dimensional environment, and the associated picture comprises the second environment picture;
the first local area is an area obtained by observing the virtual environment from a first observation point corresponding to the target virtual object;
the second local area is an area obtained by observing the virtual environment from a second observation point corresponding to the target virtual object;
and the field of view of the second observation point is larger than that of the first observation point.
9. The method of claim 8, wherein the information of the target virtual object comprises the target position;
the information corresponding to the associated picture comprises a second observation position of the second observation point determined by the server according to the target position, and information obtained after the server observes the virtual environment according to the second observation position and a second viewing angle to obtain the second local area;
and the second observation position is higher than a first observation position and/or the second viewing angle is larger than a first viewing angle, the first observation position being the observation position of the first observation point in the virtual environment determined according to the target position.
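In the 2.5D/3D case of claims 8 and 9 the wider view comes from the camera rather than from a larger crop: the second observation point sits higher above the target position and/or uses a larger viewing angle. A geometric Kotlin sketch; the type name and the concrete heights and angles are illustrative only:

    import kotlin.math.tan

    data class ObservationPoint(val x: Float, val y: Float,
                                val height: Float, val viewingAngleDeg: Double)

    // First and second observation points derived from the same target position;
    // the second is higher and has a wider viewing angle, as claim 9 allows.
    fun firstPoint(tx: Float, ty: Float) = ObservationPoint(tx, ty, 10f, 45.0)
    fun secondPoint(tx: Float, ty: Float) = ObservationPoint(tx, ty, 25f, 60.0)

    // Radius of ground visible when looking straight down; it grows with both
    // height and viewing angle, so the second local area exceeds the first.
    fun groundRadius(p: ObservationPoint): Double =
        p.height * tan(Math.toRadians(p.viewingAngleDeg / 2.0))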
10. A user interface display method, performed by a second terminal, the method comprising:
receiving, through a second short-range wireless communication component, a request signal sent by a first terminal through a first short-range wireless communication component;
sending a feedback signal to the first terminal through the second short-range wireless communication component, wherein the feedback signal carries unique identification information of an application program in the second terminal, and the unique identification information is used for triggering a server to send information corresponding to an associated picture;
receiving the information corresponding to the associated picture sent by the server;
generating the associated picture according to the information corresponding to the associated picture;
and displaying a second user interface of the application program, wherein the associated picture is displayed in the second user interface.
11. The method of claim 10, wherein the sending a feedback signal to the first terminal through the second short-range wireless communication component comprises:
displaying a request window according to the request signal, wherein the request window is used for requesting permission to display the associated picture;
and when a permission signal triggered on the request window is received, sending the feedback signal to the first terminal through the second short-range wireless communication component.
12. The method according to claim 10 or 11, wherein the associated picture comprises a second environment picture, the second environment picture comprises a picture of a second local area determined by a target position of a target virtual object in a virtual environment, and the information corresponding to the associated picture comprises information of virtual objects contained in the second local area;
the virtual environment is a two-dimensional environment;
the generating the associated picture according to the information corresponding to the associated picture comprises:
framing a map with a second viewfinder frame centered on the target position to obtain a second local map, wherein the map is a map of the virtual environment, and the second local map is a map of the second local area;
generating virtual objects on the second local map according to the information of the virtual objects contained in the second local area;
and generating the associated picture according to the second local map and the virtual objects in the second local map.
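Claim 12's generation step on the second terminal amounts to: crop the map with the second viewfinder frame, then place every reported virtual object onto the crop. A Kotlin sketch against a hypothetical drawing abstraction (LocalMap and the sprite call are invented, not a real engine API):

    data class ReportedObject(val id: Long, val x: Float, val y: Float, val spriteId: Int)

    // Hypothetical drawing surface; a real client would use its engine's canvas or texture.
    interface LocalMap {
        fun drawSprite(spriteId: Int, x: Float, y: Float)
    }

    fun generateAssociatedPicture(
        cropMap: (cx: Float, cy: Float, w: Float, h: Float) -> LocalMap, // second viewfinder crop
        targetX: Float, targetY: Float,
        objectsInArea: List<ReportedObject> // the information sent by the server
    ): LocalMap {
        val localMap = cropMap(targetX, targetY, 64f, 36f) // the second local map
        for (obj in objectsInArea) {
            // Convert map coordinates to frame-relative coordinates before drawing.
            localMap.drawSprite(obj.spriteId, obj.x - targetX, obj.y - targetY)
        }
        return localMap // the associated picture shown in the second user interface
    }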
13. The method according to claim 10 or 11, wherein the associated picture comprises a second environment picture, the second environment picture comprises a picture of a second local area determined by a target position of a target virtual object in a virtual environment, and the information corresponding to the associated picture comprises information of virtual objects contained in the second local area;
the virtual environment is a 2.5-dimensional environment or a three-dimensional environment;
the generating the associated picture according to the information corresponding to the associated picture comprises:
determining a second observation position of a second observation point corresponding to the target virtual object in a background space of the virtual environment according to the target position;
observing the background space according to the second observation position and a second viewing angle of the second observation point to obtain a local background space, wherein the local background space is the background of the second local area;
generating virtual objects in the local background space according to the information of the virtual objects contained in the second local area;
and generating the associated picture according to the local background space and the virtual objects in the local background space.
14. A user interface display method, the method being performed by a server, the method comprising:
receiving information of a target virtual object and unique identification information sent by a first terminal, wherein the target virtual object is a virtual object in an application program running in the first terminal, and the unique identification information is identification information corresponding to the application program in a second terminal;
acquiring information corresponding to an associated picture according to the information of the target virtual object, wherein the associated picture is a picture associated with the target virtual object;
and sending the information corresponding to the associated picture to the application program in the second terminal according to the unique identification information, wherein the information corresponding to the associated picture is used for assisting the second terminal in generating the associated picture.
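On the server side, the unique identification information effectively acts as a routing key from the first terminal's reports to the push channel of the application in the second terminal. A minimal in-memory Kotlin sketch (PushChannel and the payload handling are invented for illustration):

    import java.util.concurrent.ConcurrentHashMap

    // Hypothetical push channel to the application identified by the unique id.
    fun interface PushChannel { fun push(payload: ByteArray) }

    class AssociatedPictureRelay {
        private val channels = ConcurrentHashMap<String, PushChannel>()

        // The second terminal registers its channel under its unique identification info.
        fun register(uniqueId: String, channel: PushChannel) { channels[uniqueId] = channel }

        // Receive object info plus unique id from the first terminal, derive the
        // associated-picture information, and forward it to the second terminal.
        fun onReport(uniqueId: String, objectInfo: ByteArray,
                     derivePictureInfo: (ByteArray) -> ByteArray) {
            channels[uniqueId]?.push(derivePictureInfo(objectInfo))
        }
    }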
15. The method according to claim 14, wherein the associated picture comprises a second environment picture, the second environment picture comprises a picture of a second local area determined by a target position of the target virtual object in a virtual environment, the information corresponding to the associated picture comprises information of virtual objects contained in the second local area, and the information of the target virtual object comprises the target position of the target virtual object in the virtual environment;
the virtual environment is a two-dimensional environment;
the acquiring information corresponding to the associated picture according to the information of the target virtual object comprises:
framing the virtual environment with a second viewfinder frame centered on the target position to obtain the second local area;
and acquiring the information of the virtual objects contained in the second local area.
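The acquisition step of claim 15 is essentially a spatial query: frame the second local area around the target position, then keep only the virtual objects inside it. A Kotlin sketch with an invented object type:

    data class WorldObject(val id: Long, val x: Float, val y: Float)

    // Frame the virtual environment with the second viewfinder frame and collect
    // the objects falling inside the resulting second local area.
    fun objectsInSecondLocalArea(all: List<WorldObject>,
                                 targetX: Float, targetY: Float,
                                 frameW: Float, frameH: Float): List<WorldObject> {
        val halfW = frameW / 2
        val halfH = frameH / 2
        return all.filter {
            it.x in (targetX - halfW)..(targetX + halfW) &&
            it.y in (targetY - halfH)..(targetY + halfH)
        }
    }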
16. The method according to claim 14, wherein the associated picture comprises a second environment picture, the second environment picture comprises a picture of a second local area determined by a target position of the target virtual object in a virtual environment, the information corresponding to the associated picture comprises information of virtual objects contained in the second local area, and the information of the target virtual object comprises the target position of the target virtual object in the virtual environment;
the virtual environment is a 2.5-dimensional environment or a three-dimensional environment;
the acquiring information corresponding to the associated picture according to the information of the target virtual object comprises:
determining a second observation position of a second observation point corresponding to the target virtual object according to the target position;
observing the virtual environment according to the second observation position and a second viewing angle to obtain the second local area;
and acquiring the information of the virtual objects contained in the second local area.
17. A user interface display method, performed by a second terminal, the method comprising:
displaying a user interface of an application;
displaying a request window on the user interface, wherein the request window is a window through which an application program in a first terminal requests to display an associated picture, and an acceptance control is displayed in the request window;
and when a trigger signal on the acceptance control is received, displaying a second user interface, wherein the associated picture is displayed in the second user interface, the associated picture displays content associated with a target virtual object, and the target virtual object is a virtual object that is active in a virtual environment and controlled by the first terminal.
18. The method according to claim 17, wherein the first terminal displays a first user interface comprising a first environment picture, and the first environment picture comprises a picture of a first local area determined by a target position of the target virtual object in the virtual environment;
the associated picture comprises at least one of a second environment picture, a third environment picture and a situation picture;
the second environment picture comprises a picture of a second local area determined by the target position, and the area of the second local area is larger than that of the first local area;
the third environment picture comprises a picture of a local area determined by an associated position of an associated virtual object in the virtual environment, the associated virtual object being an object in the same camp as the target virtual object;
and the situation picture is a picture displaying a map carrying associated information, the associated information being information contained within the field of view of the target virtual object and/or the associated virtual object, and the map being a map of the virtual environment.
19. A user interface display device, which is applied to a first terminal, the device comprising:
a display module, configured to display a first user interface of an application program, wherein a first environment picture and a collaborative display control are displayed in the first user interface, and the first environment picture comprises a picture of a first local area determined by a target position of a target virtual object in a virtual environment;
a first short-range wireless communication module, configured to: send a request signal through a first short-range wireless communication component when a trigger signal on the collaborative display control is received, wherein the request signal is used for requesting the application program in a second terminal to display an associated picture, and the associated picture is a picture displaying content associated with the target virtual object; and receive, through the first short-range wireless communication component, a feedback signal sent by the second terminal through a second short-range wireless communication component, wherein the feedback signal comprises unique identification information corresponding to the application program in the second terminal;
and a sending module, configured to send the unique identification information and information of the target virtual object to a server, wherein the unique identification information is used for triggering the server to send information corresponding to the associated picture to the second terminal, and the information of the target virtual object is used for assisting the server in acquiring the information corresponding to the associated picture.
20. A user interface display device, which is applied to a second terminal, the device comprising:
a second short-range wireless communication module, configured to: receive, through a second short-range wireless communication component, a request signal sent by a first terminal through a first short-range wireless communication component; and send a feedback signal to the first terminal through the second short-range wireless communication component, wherein the feedback signal carries unique identification information of an application program in the second terminal, and the unique identification information is used for triggering a server to send information corresponding to an associated picture;
a receiving module, configured to receive the information corresponding to the associated picture sent by the server;
a processing module, configured to generate the associated picture according to the information corresponding to the associated picture;
and a display module, configured to display a second user interface of the application program, wherein the associated picture is displayed in the second user interface.
21. A user interface display device, which is applied to a server, the device comprising:
a receiving module, configured to receive information of a target virtual object and unique identification information sent by a first terminal, wherein the target virtual object is a virtual object in an application program running in the first terminal, and the unique identification information is identification information corresponding to the application program in a second terminal;
a processing module, configured to acquire information corresponding to an associated picture according to the information of the target virtual object, wherein the associated picture is a picture associated with the target virtual object;
and a sending module, configured to send the information corresponding to the associated picture to the application program in the second terminal according to the unique identification information, wherein the information corresponding to the associated picture is used for assisting the second terminal in generating the associated picture.
22. A terminal, characterized in that the terminal comprises a processor, a memory and a first short-range wireless communication component, wherein the memory has stored therein at least one instruction, which is loaded and executed by the processor to implement the user interface display method according to any one of claims 1 to 9.
23. A terminal, characterized in that the terminal comprises a processor, a memory and a second short-range wireless communication component, wherein the memory has stored therein at least one instruction, which is loaded and executed by the processor to implement the user interface display method according to any one of claims 10 to 13.
24. A computer device comprising a processor and a memory, the memory having stored therein at least one instruction that is loaded and executed by the processor to implement a user interface display method according to any one of claims 14 to 18.
25. A computer system, characterized in that the computer system comprises the user interface display device according to claim 19, the user interface display device according to claim 20 and the user interface display device according to claim 21;
or,
the computer system comprises the terminal according to claim 22, the terminal according to claim 23 and the computer device according to claim 24.
26. A computer-readable storage medium having stored thereon at least one instruction which is loaded and executed by a processor to implement a user interface display method according to any one of claims 1 to 18.
CN201910068959.1A 2019-01-24 2019-01-24 User interface display method, device, equipment and system Active CN109806583B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910068959.1A CN109806583B (en) User interface display method, device, equipment and system

Publications (2)

Publication Number Publication Date
CN109806583A CN109806583A (en) 2019-05-28
CN109806583B true CN109806583B (en) 2021-11-23

Family

ID=66603710

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910068959.1A Active CN109806583B (en) 2019-01-24 2019-01-24 User interface display method, device, equipment and system

Country Status (1)

Country Link
CN (1) CN109806583B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111672106B (en) * 2020-06-05 2022-05-24 腾讯科技(深圳)有限公司 Virtual scene display method and device, computer equipment and storage medium
CN112604302B (en) * 2020-12-17 2022-08-26 腾讯科技(深圳)有限公司 Interaction method, device, equipment and storage medium of virtual object in virtual environment
CN112704883B (en) * 2020-12-30 2022-08-05 腾讯科技(深圳)有限公司 Method, device, terminal and storage medium for grouping virtual objects in virtual environment
CN118283848A (en) * 2022-12-29 2024-07-02 华为终端有限公司 Multi-electronic-device interaction method and system and electronic device

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101594403A (en) * 2008-05-29 2009-12-02 Lg电子株式会社 Transparent display and method of operation thereof
CN101916186A (en) * 2010-07-30 2010-12-15 深圳创维-Rgb电子有限公司 Method, device and terminal for expanded display of mobile terminal view
CN102077161A (en) * 2008-06-30 2011-05-25 日本电气株式会社 Information processing device, display control method, and recording medium
CN104838353A (en) * 2012-12-07 2015-08-12 优特设备有限公司 Coordination contextual display data on a display screen
CN105247469A (en) * 2013-04-30 2016-01-13 微软技术许可有限责任公司 Automatically manipulating visualized data based on interactivity
JP5952644B2 (en) * 2012-05-31 2016-07-13 任天堂株式会社 Program, information processing method, information processing apparatus, and display system
CN107402633A (en) * 2017-07-25 2017-11-28 深圳市鹰硕技术有限公司 Safety education system based on image simulation technology
CN107783741A (en) * 2016-08-25 2018-03-09 中兴通讯股份有限公司 Control the method, device and mobile terminal of multiple mobile terminal screen tiled displays
CN108126344A (en) * 2018-01-24 2018-06-08 网易(杭州)网络有限公司 The sharing method of position, storage medium in game
CN109101208A (en) * 2018-08-15 2018-12-28 网易(杭州)网络有限公司 Display methods, display device and the display system of interface

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3734815B2 (en) * 2003-12-10 2006-01-11 任天堂株式会社 Portable game device and game program
JP5573117B2 (en) * 2009-11-17 2014-08-20 ソニー株式会社 Display control system, display control device, and display control method
US20130005469A1 (en) * 2011-06-30 2013-01-03 Imerj LLC Dual screen game module
US10549204B2 (en) * 2015-09-30 2020-02-04 Sony Interactive Entertainment America Llc Systems and methods for providing augmented data-feed for game play re-creation and dynamic replay entry points
JP6270973B1 (en) * 2016-12-13 2018-01-31 株式会社コロプラ GAME METHOD AND GAME PROGRAM


Also Published As

Publication number Publication date
CN109806583A (en) 2019-05-28

Similar Documents

Publication Publication Date Title
US11703993B2 (en) Method, apparatus and device for view switching of virtual environment, and storage medium
CN108619721B (en) Distance information display method and device in virtual scene and computer equipment
CN111035918B (en) Reconnaissance interface display method and device based on virtual environment and readable storage medium
CN109806583B (en) User interface display method, device, equipment and system
WO2019205881A1 (en) Method and apparatus for displaying information in virtual environment, device, and storage medium
CN108671543A (en) Labelled element display methods, computer equipment and storage medium in virtual scene
CN112494955B (en) Skill releasing method, device, terminal and storage medium for virtual object
CN111050189B (en) Live broadcast method, device, equipment and storage medium
CN110045827B (en) Method and device for observing virtual article in virtual environment and readable storage medium
CN109634413B (en) Method, device and storage medium for observing virtual environment
CN110496392B (en) Virtual object control method, device, terminal and storage medium
US11790607B2 (en) Method and apparatus for displaying heat map, computer device, and readable storage medium
CN112704883A (en) Method, device, terminal and storage medium for grouping virtual objects in virtual environment
CN111589141B (en) Virtual environment picture display method, device, equipment and medium
CN111744185B (en) Virtual object control method, device, computer equipment and storage medium
CN110448908B (en) Method, device and equipment for applying sighting telescope in virtual environment and storage medium
CN111603770A (en) Virtual environment picture display method, device, equipment and medium
CN111589127A (en) Control method, device and equipment of virtual role and storage medium
CN112691370A (en) Method, device, equipment and storage medium for displaying voting result in virtual game
CN113058264A (en) Virtual scene display method, virtual scene processing method, device and equipment
CN113289336A (en) Method, apparatus, device and medium for tagging items in a virtual environment
CN111589116A (en) Method, device, terminal and storage medium for displaying function options
CN112604302B (en) Interaction method, device, equipment and storage medium of virtual object in virtual environment
CN112755517B (en) Virtual object control method, device, terminal and storage medium
CN113134232A (en) Virtual object control method, device, equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant