WO2019179314A1 - Marker point position display method, electronic device, and computer-readable storage medium - Google Patents
Marker point position display method, electronic device, and computer-readable storage medium
- Publication number
- WO2019179314A1 (PCT/CN2019/077294, CN2019077294W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- area
- scene
- target
- location
- target scene
- Prior art date
Classifications
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
- A63F13/426—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
- A63F13/533—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
- A63F13/5372—Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators, e.g. showing the condition of a game character on screen for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
- A63F13/5378—Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment using icons
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- A63F2300/1075—Features of games using an electronically generated display having two or more dimensions characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface using a touch screen
- A63F2300/305—Features of games using an electronically generated display having two or more dimensions characterized by output arrangements for displaying additional data, e.g. simulating a Head Up Display, for providing a graphical or textual hint to the player
- A63F2300/306—Features of games using an electronically generated display having two or more dimensions characterized by output arrangements for displaying additional data, for displaying a marker associated to an object or location in the game field
- A63F2300/307—Features of games using an electronically generated display having two or more dimensions characterized by output arrangements for displaying additional data, for displaying an additional window with a view from the top of the game field, e.g. radar screen
- A63F2300/6045—Features of games using an electronically generated display having two or more dimensions characterized by methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
Definitions
- Embodiments of the present invention relate to the field of computer technologies, and in particular, to a marker point location display method, an electronic device, and a computer readable storage medium.
- In the related art, the virtual scene around the user's current location is displayed full-screen on the current display interface, and a global map covering the entire game area is displayed in the upper right corner of that interface.
- The terminal may display a marker point position as follows: when the user selects a point in the global map, the terminal displays both the marker point and the user's current location on the global map.
- The user can then move toward the marker point in the virtual scene; while the user moves, the terminal synchronously shows the current location gradually approaching the marker point in the global map.
- When the user visually observes that the current location coincides with the marker point, the user concludes that the marker point position has been reached in the virtual scene.
- In other words, the above method relies on a manual judgment of whether the marker point position has been reached.
- However, a single point in the global map actually represents a virtual area of considerable extent in the virtual scene, for example a city or a park. Even if the current location falls somewhere in that area other than the exact marked spot, the user may still observe the current location coinciding with the marker point in the global map while not actually having reached the marked location in the virtual scene, resulting in low accuracy of the marker point position display.
- The embodiments of the present invention provide a marker point position display method, an electronic device, and a computer-readable storage medium, which can solve the problem of low marker point position display accuracy.
- The technical solution is as follows:
- A marker point position display method, comprising:
- A marker point position display device, comprising:
- an acquiring module configured to acquire a marker point position in a global map of a virtual scene, where the global map is used to display a thumbnail of the virtual scene;
- a determining module configured to determine a target scene area indicated by the marker point position in the virtual scene, where the target scene area is the area marked by the marker point position; and
- a display module configured to display a prompt signal in a view image of a current control object, where the prompt signal is used to indicate the position of the target scene area in the virtual scene, and the view image is the part of the virtual scene that can be observed within the field of view of the current control object.
- An electronic device comprising a processor and a memory, the memory storing at least one instruction that is loaded and executed by the processor to perform the operations of the marker point position display method described above.
- A computer-readable storage medium having stored therein at least one instruction that is loaded and executed by a processor to perform the operations of the marker point position display method described above.
- When the terminal acquires the marker point position in the global map, it can display a prompt signal in the view image of the current control object based on the target scene area indicated by the marker point position in the virtual scene, thereby clearly and accurately identifying the location of the target scene area in the virtual scene. The current control object can then reach the target scene area accurately and quickly according to the prompt signal, which greatly improves the display accuracy of the marker point position.
- FIG. 1 is a schematic diagram of a display interface according to an embodiment of the present invention.
- FIG. 2 is a flowchart of a marker point position display method according to an embodiment of the present invention.
- FIG. 3 is a schematic diagram of a display interface according to an embodiment of the present invention.
- FIG. 4 is a diagram showing an example of a display interface according to an embodiment of the present invention.
- FIG. 5 is a schematic diagram of a display interface according to an embodiment of the present invention.
- FIG. 6 is a diagram showing an example of a display interface according to an embodiment of the present invention.
- FIG. 7 is a schematic diagram of a display interface according to an embodiment of the present invention.
- FIG. 8 is a diagram showing an example of a display interface according to an embodiment of the present invention.
- FIG. 9 is a schematic diagram of a display interface according to an embodiment of the present invention.
- FIG. 10 is a schematic diagram of a display interface according to an embodiment of the present invention.
- FIG. 11 is a diagram showing an example of a display interface according to an embodiment of the present invention.
- FIG. 12 is a flowchart of a marker point position display method according to an embodiment of the present invention.
- FIG. 13 is a flowchart of a marker point position display method according to an embodiment of the present invention.
- FIG. 14 is a schematic structural diagram of a marker point position display device according to an embodiment of the present invention.
- FIG. 15 is a block diagram showing the structure of an electronic device 1500 according to an exemplary embodiment of the present invention.
- The embodiments of the present invention mainly relate to an electronic game scenario, in which a user can perform an operation on the terminal in advance.
- After detecting the user's operation, the terminal can download a game configuration file of the electronic game, which may include the application of the electronic game, page display data, virtual scene data, and the like.
- The terminal can then call the game configuration file, and render and display the electronic game page based on that file.
- The user can perform a touch operation on the terminal.
- After detecting the touch operation, the terminal can determine the game data corresponding to it and render and display that game data.
- The game data can include virtual scene data, behavior data of virtual objects in the virtual scene, and so on.
- The virtual scene in the present invention may be used to simulate a virtual space, which may be a three-dimensional or two-dimensional virtual space and may be an open space.
- The virtual scene may be used to simulate a real environment; for example, it may include sky, land, and ocean, and the land may include environmental elements such as deserts and cities.
- The user can control a virtual object to move in the virtual scene. The virtual object can be a virtual avatar representing the user in the virtual scene, and the avatar can take any form, for example a person or an animal; the present invention is not limited thereto.
- The user can control the virtual object to fall freely, glide, or open a parachute in the sky of the virtual scene; to run, jump, crawl, or bend over on the land; and to swim, float, or dive in the ocean.
- The user can also control the virtual object to travel through the virtual scene.
- This is not specifically limited in the embodiments of the present invention.
- The user can also control the virtual object to fight other virtual objects with a weapon.
- The weapon may be a cold (melee) weapon or a hot weapon (firearm), which is not specifically limited in the present invention.
- The user can select a destination in the global map, so as to control the current control object to reach that destination in the virtual scene.
- The user can play the game together with other users; that is, the currently controlled object is associated with the other objects in the same team, and the destination can be selected by any control object in the team.
- The terminal can exchange data with the terminal of the control object that selected the destination to acquire the destination selected by the associated object, and then mark that destination in the global map.
- The terminal can display the virtual scene in full screen on the current display interface and independently display the global map in a preset area of the current display interface.
- The global map is used to display a thumbnail of the virtual scene, and the thumbnail describes geographic features such as the terrain, landform, and geographic locations of the virtual scene.
- The preset area can be set according to the user's operating habits. For example, as shown in FIG. 1, the preset area may be a rectangular area in the upper right corner or lower left corner of the current display interface.
- The terminal may be any terminal on which the application is installed, which is not specifically limited in the embodiments of the present invention.
- For example, the terminal may be a mobile phone terminal, a tablet (PAD) terminal, or a computer terminal.
- FIG. 2 is a flowchart of a marker point position display method according to an embodiment of the present invention.
- The method is applied to an electronic device; the electronic device being a terminal is taken as an example here, so the execution body of the method may be the terminal. Referring to FIG. 2, the method includes the following steps.
- 201. When detecting a selection operation on the global map, the terminal acquires the position coordinates selected by the operation and determines the corresponding position in the global map as the marker point position.
- The marker point position is a position selected in the global map.
- When the user wants to reach a certain destination, the user can select that destination in the global map and mark it as a marker point; the marker point position is the location of the destination in the global map.
- The marker point position can be represented by position coordinates in the global map.
- Accordingly, the terminal may acquire the position coordinates of the marker point in the global map and determine the corresponding position in the global map as the marker point position.
- The marker point position can be selected by a selection operation of the current user.
- The selection operation may be triggered by a touch event of the user's finger, or by a click event of a mouse or another input device. Accordingly, this step may be: when the terminal detects a touch event or a click event on the global map, the terminal determines that a selection operation has occurred on the global map, acquires the position coordinates of the touch event or click event on the global map, and determines those coordinates as the position coordinates selected by the selection operation.
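- A minimal sketch of this hit-testing step is given below, assuming a rectangular minimap whose screen placement is known; the `Rect` type and `on_pointer_event` name are hypothetical, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # left edge of the minimap in screen pixels
    y: float  # top edge in screen pixels
    w: float  # width in pixels
    h: float  # height in pixels

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def on_pointer_event(px: float, py: float, minimap: Rect) -> tuple[float, float] | None:
    """Treat a touch or click inside the minimap rectangle as a selection
    operation; return normalized map coordinates in [0, 1], or None when
    the event falls outside the global map."""
    if not minimap.contains(px, py):
        return None
    u = (px - minimap.x) / minimap.w
    v = (py - minimap.y) / minimap.h
    return (u, v)  # the marker point position, expressed in map space
```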
- The terminal may mark the marker point position by adding a preset icon: it may add the preset icon at the marker point position itself, or add it at another position in the current display interface and display indication information for the icon.
- The indication information is used to indicate the marker point position that the preset icon refers to, so that the preset icon does not block the area surrounding the marker point position.
- The preset icon can be a water-drop icon, an arrow, or the like.
- The terminal may display the global map in the upper right corner of the current display interface; when the user's finger selects the marker point position in the global map, the terminal may mark that location with a water-drop icon.
- FIG. 4 provides a schematic diagram of an actual terminal interface display; in combination with the effect comparison diagram shown in FIG. 4, the specific display effect can be seen more clearly.
- The user may also play the game with other users. When multiple users play together, an association relationship may be established between the control objects respectively controlled by those users; the marker point position may then be selected by any one of the control objects, and the control object that selected it shares the marker point position with the associated control objects.
- In that case, step 201 may be replaced with: the terminal receives the marker point position selected by an associated object of the current control object.
- The associated object's terminal may directly send the position coordinates of the marker point position.
- The terminal receives the position coordinates sent by the terminal of the associated object and determines the corresponding position in the global map as the marker point position.
- The associated object's selection is made by the associated user who is associated with the current user.
- The associated object's selection process is similar to the selection operation of the current user described above and is not repeated here.
- The terminal may also display the global map only when receiving a display instruction triggered by the user.
- Accordingly, the step of acquiring the position coordinates selected by the selection operation may be: when the terminal detects a display instruction indicating that the global map should be displayed, it displays the global map in the preset area of the current display interface; when it then detects a selection operation on the global map, it acquires the position coordinates selected by that operation on the global map.
- The display instruction may be triggered by a display button on the current display interface of the terminal, by a specified voice command, or by a specified action such as shaking or rotating the terminal.
- The terminal may also detect whether the user needs to perform the selection operation in the global map and, if so, enlarge the global map so that the user can make the selection in the enlarged map.
- Accordingly, the step of acquiring the position coordinates selected by the selection operation may be: when detecting a trigger operation in the preset area of the terminal, the terminal switches the global map from a first window to a second window for display; when detecting the selection operation on the global map in the second window, the terminal acquires the position coordinates selected by that operation.
- The display area of the second window is larger than that of the first window, and the trigger operation is a touch event of a finger in the preset area or a click event of a mouse in the preset area.
- the preset area may be set as needed, for example, the preset area may be the entire area occupied by the first window, or the central area, the edge area, and the like of the area occupied by the first window; the first window may be The upper right corner area and the lower left corner area of the current display interface.
- the second window may be a left half screen area or a right half screen area of the current display interface.
- The global map may initially occupy a small portion of the terminal screen, such as the upper right or lower right corner. By switching the global map to an area with a larger display area, the terminal overcomes the limitation of the terminal screen size, improves the accuracy of the selected marker point position, and improves the user experience.
- For example, the terminal may first display the global map in the upper right corner of the terminal screen, as shown in the left figure.
- When the preset area is triggered, the terminal switches the global map to the right half of the screen for display, as shown in the right figure.
- The terminal may also display the location of the current control object in the global map, so that the user can directly observe, on the map, the positional relationship between the current location of the current control object and the marker point position.
- The terminal may further display the current moving direction of the current control object in the global map, so that the user can adjust the moving direction in time based on the marker point position, avoiding detours and reaching the marker point position as soon as possible.
- FIG. 6 provides a schematic diagram of an actual terminal interface display. In combination with the effect comparison diagram shown in FIG. 6, the specific display effect can be more clearly seen.
- The terminal may mark the locations of both the current control object and the associated objects in the global map.
- The terminal may also set the display color of the marker point position to the marker color of the associated object that selected it, so that the user can visually identify which object selected the marker point position.
- FIG. 8 provides a schematic diagram of an actual terminal interface display; in combination with the effect comparison diagram shown in FIG. 8, the specific display effect can be seen more clearly.
- The sharing process may be as follows: the terminal acquires the object identifier of the associated object and sends the marker point position to the associated object according to the object identifier.
- The object identifier may be the game ID (identity) or nickname of the associated object, or the like.
- When the terminal has established a communication connection with a server, the terminal can send the marker point position to the server, and the server forwards it to the terminal where the associated object is located.
- Alternatively, the terminal may directly establish a communication connection with the terminal where the associated object is located, for example a Bluetooth connection or an NFC (Near Field Communication) connection, and send the marker point position directly to that terminal over the connection.
- The terminal may first mark the marker point position on the global map and then send it to the associated object, or it may first send the marker point position and then mark it, or it may send and mark at the same time.
- This is not specifically limited in the embodiments of the present invention.
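- The server-relay variant of this sharing step can be sketched as below; the message fields and the `send_to_server` transport callable are assumptions for illustration, not the patent's actual protocol.

```python
import json
import time

def share_marker(sender_id: str, team_ids: list[str],
                 marker_uv: tuple[float, float], send_to_server) -> None:
    """Package the marker point position and ask the server to forward it
    to the terminals of the associated objects (teammates)."""
    message = {
        "type": "marker_update",
        "from": sender_id,         # object identifier, e.g. a game ID or nickname
        "to": team_ids,            # associated objects that share the marker
        "map_pos": marker_uv,      # normalized global-map coordinates
        "timestamp": time.time(),  # lets receivers discard stale markers
    }
    send_to_server(json.dumps(message))
```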
- Based on the marker point position in the global map, the terminal may determine the target scene area indicated by that position and mark the target scene area in the virtual scene, so that the current control object can clearly know the marked location.
- 202. The terminal determines the target scene area indicated by the marker point position in the virtual scene.
- The target scene area is the area marked by the marker point position; the global map displays a thumbnail of the virtual scene.
- The global map has a mapping relationship with the virtual scene; the terminal may pre-store this mapping relationship and determine, according to it, the target scene area indicated by the marker point position in the virtual scene.
- The mapping relationship may be between a location point in the global map and a location point in the virtual scene, or between a location point in the global map and a scene area in the virtual scene. Based on this mapping relationship, this step can be implemented in any of the following three manners.
- In the first manner, the terminal determines, according to the position coordinates of the marker point position in the global map, the corresponding scene position coordinates of the marker point position in the virtual scene, and determines the area within a first preset range of those scene position coordinates as the target scene area.
- Specifically, the terminal may pre-store the mapping relationship between position coordinates in the global map and scene position coordinates in the virtual scene. From this mapping relationship, the terminal obtains, according to the position coordinates of the marker point position in the global map, the corresponding scene position coordinates in the virtual scene; the terminal then takes the first preset range around those scene position coordinates as the target scene area according to a preset generation principle.
- The preset generation principle may be set as needed, which is not specifically limited in the embodiments of the present invention.
- For example, the preset generation principle may define an area centered on the scene position coordinates with a preset distance as the radius; as shown in FIG. 9, the area within the first preset range is then a circular area centered at point O with radius R.
- Alternatively, the preset generation principle may define a square centered on the scene position coordinates with a preset distance as the side length; the area within the first preset range is then a square area centered at point O with side length a.
- In this way, the terminal expands the single location point into an area of a certain extent, thereby accurately determining the target scene area in the virtual scene, improving the recognizability of the target scene area for the current user, and improving the user experience.
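- The first manner can be sketched as follows, assuming the global map is a uniform thumbnail of a rectangular world so that the mapping is a simple scale; the type names, world dimensions, and the default radius are illustrative assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class CircularArea:
    cx: float  # scene position coordinates corresponding to the marker point
    cz: float
    r: float   # first preset range: a preset distance used as the radius

    def contains(self, x: float, z: float) -> bool:
        return math.hypot(x - self.cx, z - self.cz) <= self.r

def target_area_from_marker(marker_uv: tuple[float, float],
                            world_w: float, world_d: float,
                            radius: float = 50.0) -> CircularArea:
    """Map normalized global-map coordinates to scene position coordinates
    by the thumbnail scale, then expand that point into the circular area
    of FIG. 9 (centered at point O with radius R)."""
    u, v = marker_uv
    return CircularArea(cx=u * world_w, cz=v * world_d, r=radius)
```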
- In the second manner, the terminal determines, according to the position coordinates of the marker point position in the global map, the corresponding scene position coordinates of the marker point position in the virtual scene, and determines the area of the feature element at those scene position coordinates as the target scene area.
- Specifically, the terminal may also store a mapping relationship between scene position coordinates and feature elements. After determining the scene position coordinates of the marker point position in the virtual scene, the terminal determines, from this mapping relationship, the feature element corresponding to those coordinates and takes the area where that feature element is located as the target scene area.
- The feature element may be an element with a distinct contour or building structure in the virtual scene.
- For example, the feature element may be a lake, a park, a villa, a hospital, or the like.
- The manner in which the terminal determines the scene position coordinates of the marker point position in the virtual scene is the same as in the first manner described above and is not repeated here.
- The terminal may thus map the marker point position to a specific feature element in the virtual scene, for example a hospital or a park, which makes the target scene area clearer and improves its recognizability in the virtual scene.
- Moreover, by matching the marker point position to a specific building or geographic element, the geographic extent of the target scene area is visually more clearly defined, which improves the accuracy of determining the target scene area.
- In addition, the current control object can recognize the target scene area in advance based on the appearance and outline of the feature element and prepare accordingly, which greatly improves the user's gaming experience.
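- A sketch of the second manner follows, assuming the terminal stores feature elements (hospital, park, and so on) with axis-aligned rectangular footprints; selecting the element whose footprint covers the scene coordinates stands in for the stored coordinate-to-element mapping.

```python
from dataclasses import dataclass

@dataclass
class FeatureElement:
    name: str  # e.g. "hospital", "park"
    x0: float  # footprint bounds in scene coordinates
    z0: float
    x1: float
    z1: float

    def covers(self, x: float, z: float) -> bool:
        return self.x0 <= x <= self.x1 and self.z0 <= z <= self.z1

def feature_at(scene_x: float, scene_z: float,
               elements: list[FeatureElement]) -> FeatureElement | None:
    """Return the feature element whose area becomes the target scene area,
    or None when the marker does not fall on any stored feature element."""
    for element in elements:
        if element.covers(scene_x, scene_z):
            return element
    return None
```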
- In the third manner, the terminal looks up the corresponding target scene area in the virtual scene from the mapping relationship between position coordinates and scene areas, according to the position coordinates of the marker point position in the global map.
- The global map is actually a thumbnail of the virtual scene, reduced according to a preset ratio.
- Each location point in the global map may represent a scene area in the virtual scene, and the terminal may pre-store the correspondence between location points in the global map and scene areas in the virtual scene.
- The terminal then determines, from this correspondence and the position coordinates of the marker point position on the global map, the corresponding scene area, and uses that scene area as the target scene area for the marker point position.
- In addition, the terminal may send the target scene area to the associated object, so that the associated object knows the target scene area.
- Alternatively, the terminal may send only the marker point position to the associated object, and the associated object determines the target scene area based on the marker point position.
- The terminal can also send both the marker point position and the target scene area to the associated object; this is not specifically limited in the embodiments of the present invention.
- In this manner, the terminal can quickly determine the target scene area based on the mapping relationship, which greatly improves the processing efficiency of the terminal.
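- The third manner amounts to a precomputed table from map positions to scene areas. One common realization, shown here as an assumption rather than the patent's data structure, quantizes the map into grid cells and stores one scene-area identifier per cell.

```python
def region_for_marker(marker_uv: tuple[float, float],
                      region_grid: list[list[str]]) -> str:
    """Look up the scene area for a marker by quantizing its normalized
    global-map coordinates into the pre-stored grid of scene areas."""
    rows, cols = len(region_grid), len(region_grid[0])
    u, v = marker_uv
    col = min(int(u * cols), cols - 1)  # clamp so u == 1.0 stays in range
    row = min(int(v * rows), rows - 1)
    return region_grid[row][col]        # identifier of the target scene area
```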
- Before displaying the prompt signal, the terminal may further determine, based on the view image of the current control object and the location of the target scene area, whether the target scene area is included in the view image, and mark the target scene area according to the determination result, as described in the following two steps.
- 203. When the view image does not include the target scene area, the terminal displays the prompt signal in the direction of the target scene area in the view image.
- The view image is the part of the virtual scene that can be observed within the field of view of the current control object, and the prompt signal is used to indicate the position of the target scene area relative to the view image.
- Specifically, the terminal may display the prompt signal at the horizon position in the direction of the target scene area in the view image.
- The prompt signal may be, for example, a column of beacon smoke rising at the distant horizon in the direction of the target scene area, a flashing signal light, or the like.
- In this way, even though the range of the current control object's view image is limited, the terminal can indicate the general direction of the target scene area in the virtual scene, guiding the current control object to move toward that direction so that it can quickly find the target scene area.
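- Placing this directional prompt can be sketched as below, assuming a camera with yaw `cam_yaw` and horizontal field of view `fov`; the `spawn_smoke_at_yaw` effect call is a hypothetical placeholder for the engine's rendering of the beacon smoke.

```python
import math

def horizon_prompt(player_xz: tuple[float, float],
                   target_xz: tuple[float, float],
                   cam_yaw: float, fov: float,
                   spawn_smoke_at_yaw) -> None:
    """Compute the bearing from the current control object to the target
    scene area and spawn the prompt (e.g. beacon smoke) at the horizon in
    that direction, clamped to the edge of the visible field of view."""
    dx = target_xz[0] - player_xz[0]
    dz = target_xz[1] - player_xz[1]
    bearing = math.atan2(dx, dz)  # world-space yaw toward the target area
    rel = (bearing - cam_yaw + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
    rel = max(-fov / 2, min(fov / 2, rel))  # keep the signal on screen
    spawn_smoke_at_yaw(cam_yaw + rel)       # hypothetical effect-spawning call
```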
- 204. When the view image includes the target scene area, the terminal displays the prompt signal within a second preset range of the target scene area in the view image.
- The prompt signal may be a text prompt, an icon prompt, and/or a display special effect.
- The terminal displays the prompt signal within the second preset range of the target scene area in the view image, so that the current control object can accurately find the target scene area.
- The second preset range may be the area where the target scene area is located, the upper right corner area of the target scene area, or the like, which is not specifically limited in the embodiments of the present invention.
- This step can be implemented in the following three ways.
- In the first way, the terminal displays the text prompt within a third preset range of the target scene area in the view image.
- The terminal may pre-store the text prompt corresponding to the target scene area; after determining the target scene area, the terminal obtains the corresponding text prompt from the designated storage space and inserts it within the third preset range of the target scene area.
- The third preset range may be the same as or different from the second preset range.
- For example, the third preset range may be the upper right corner area of the target scene area, the central area of the target scene area, or the like.
- The text prompt can be set based on the user's needs, which is not specifically limited in the embodiments of the present invention. For example, the text prompt may be "Here is the destination!", "The destination is here!", and the like.
- In the second way, the terminal adds the icon prompt within a fourth preset range of the target scene area in the view image.
- The icon prompt may be set based on the user's needs, which is not specifically limited in the embodiments of the present invention.
- For example, the icon prompt may be a finger icon pointing to the target scene area, a flag icon, or the like.
- The specific implementation of this way is the same as that of the first way and is not repeated here.
- In the third way, the terminal adds the display special effect within a fifth preset range of the target scene area in the view image.
- The display special effect may change the display color of the target scene area, add dynamic pictures to the target scene area, or display the target scene area with a blinking effect.
- Accordingly, this way may be: the terminal changes the original display color of the target scene area to a target color, displays a dynamic picture on the layer of the target scene area, or plays a blinking special effect within the fifth preset range.
- The terminal can add the display special effect by adding a layer. Specifically, the terminal may add a layer on top of the layer of the target scene area. To change the original display color of the target scene area to the target color, the terminal draws the area corresponding to the target scene area in the added layer in the target color. To display a dynamic picture on the layer of the target scene area, the terminal draws the dynamic picture in the area surrounding the target scene area; the dynamic picture may be, for example, smoke drifting around the target scene area, or a picture of associated teammates who have already reached the target scene area jumping, and the like.
- The terminal may highlight the target scene area once every preset period, thereby producing the blinking display effect for the target scene area.
- The target color is different from the original display color of the target scene area, or may be different from the display color of the scene area surrounding the target scene area. The terminal may select, according to the original display color of the target scene area, a color different from it as the target color; alternatively, the terminal may select a color different from the display color of the surrounding scene area as the target color, so that the target scene area can be clearly distinguished from the surrounding scene area.
- For example, the terminal may render the target scene area in the target color to distinguish it from the other areas in the view image.
- FIG. 11 is a schematic diagram showing an actual terminal interface display. In combination with the effect comparison diagram shown in FIG. 11, the specific display effect can be more clearly seen.
- In this way, the terminal visually distinguishes the target scene area from the surrounding area, attracting the attention of the current control object and achieving an eye-catching prompt effect.
- By displaying prompt signals such as text prompts, icon prompts, or display special effects within the preset range of the target scene area, the terminal prompts the user while also adding interest to the game process.
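- The color-selection and blinking behavior described above can be sketched as follows; the `overlay.fill_area` call is a hypothetical layer-drawing API, and inverting the original color is just one way to guarantee a contrasting target color.

```python
def pick_target_color(original_rgb: tuple[int, int, int]) -> tuple[int, int, int]:
    """Pick a target color that differs from the area's original display
    color by inverting it; any sufficiently contrasting choice would do."""
    r, g, b = original_rgb
    return (255 - r, 255 - g, 255 - b)

def draw_highlight(now_s: float, period_s: float, overlay, area, color) -> None:
    """Called every frame: fill the target scene area on the added overlay
    layer during alternate periods, producing the blinking display effect."""
    if int(now_s / period_s) % 2 == 0:  # visible one period, hidden the next
        overlay.fill_area(area, color)  # hypothetical layer-drawing call
```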
- Of course, the terminal may also combine any two or all of the text prompt, the icon prompt, and the display special effect through the foregoing three implementation ways, thereby highlighting the target scene area in the view image.
- The foregoing steps 203-204 are a specific implementation of "the terminal displays the prompt signal in the view image of the current control object": as the current control object approaches the target scene area from far to near, the terminal provides the corresponding prompt signal for the target scene area.
- In a possible scenario, the terminal may display the prompt signal only when the current state of the current control object satisfies a target display condition.
- The target display condition can be set as needed, which is not specifically limited in the embodiments of the present invention.
- For example, the target display condition may be that the distance between the position of the current control object and the target scene area does not exceed a target distance, that the current user corresponding to the current control object triggers a prompt request, or that the current control object is hit.
- 205. The terminal stops displaying the prompt signal in the view image.
- The marker point position has limited validity: the terminal may determine, based on whether the marker point position is still valid, whether to continue prompting the corresponding target scene area. The marker point position becomes invalid when the current control object reaches the target scene area or when the marker point position is updated.
- The terminal can monitor the position of the current control object in real time and determine, based on that position, whether the current control object has reached the target scene area. When the position of the current control object falls within the target scene area, the terminal determines that the current control object has arrived and stops performing step 203 or 204. The marker point position can also be updated in real time while the current control object moves; when the marker point position in the global map is updated to a different marker point position, the terminal likewise stops performing step 203 or 204.
- The marker point position may be updated by the current control object or by an associated object of the current control object. If the current control object updates the marker point position, the process of determining the new position based on the selection operation is the same as in step 201 and is not repeated here. If an associated object updates the marker point position, the terminal receives the updated position sent by the associated object's terminal, displays it on the global map, and determines and marks the target scene area indicated by the updated position through the process shown in steps 202-204.
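- The stop condition of step 205 can be sketched as below; `CircularArea` is the target-area type from the earlier sketch, and the version counter is an assumed way of detecting that a newer marker point position has arrived.

```python
def should_stop_prompt(player_xz: tuple[float, float],
                       area,                  # e.g. the CircularArea sketched earlier
                       marker_version: int,
                       displayed_version: int) -> bool:
    """The marker point position becomes invalid, and the prompt stops,
    when the current control object reaches the target scene area or when
    a newer marker point position has replaced the displayed one."""
    reached = area.contains(player_xz[0], player_xz[1])
    updated = marker_version != displayed_version
    return reached or updated
```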
- In addition, when the terminal detects a trigger operation on the screen, it determines whether the trigger operation satisfies a specific condition; the specific condition may be whether the operation position of the trigger operation falls within the preset area.
- The terminal acquires the marker point position on the global map and converts it into a target scene area in the virtual scene; the terminal may also synchronize the marker point position with the terminal of the associated object.
- Even when the terminal displays the position of the current control object as coinciding with the marker point position in the global map, whether the marker point position has truly been reached can be further determined based on the target scene area marked in the view image: the current control object is determined to have reached the marker point position only when it is inside the target scene area. This resolves the judgment errors caused by the low accuracy of the global map, greatly improves the accuracy of the judgment, and thereby achieves accurate display.
- In the embodiment of the present invention, when the terminal acquires the marker point position in the global map, it can display a prompt signal in the view image of the current control object based on the target scene area indicated by the marker point position in the virtual scene, thereby clearly and accurately identifying the location of the target scene area in the virtual scene. The current control object can then reach the target scene area accurately and quickly according to the prompt signal, which greatly improves the display accuracy of the marker point position.
- FIG. 13 is a flowchart of a method for displaying a marker position according to an embodiment of the present invention. The method can be applied to a terminal. Referring to FIG. 13, the method includes:
- 1301. The terminal acquires a touch event or a click event on the terminal screen.
- FIG. 3 shows an application interface of the game application, on which the user can perform operations, for example a finger touch on the screen or a mouse click within the application interface.
- 1302. The terminal determines the position in the global map corresponding to the operation position as the marker point position.
- the terminal may display the global map in the upper right corner of the current display interface.
- the terminal may mark the marker point location with a water-drop icon at that location.
- the terminal may also switch the global map to the right half screen area display.
- the terminal sends the marked point location to the terminal where the associated object of the current control object is located.
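- For illustration only, a marker point update could be serialized as follows before being sent to the terminals of associated objects; the message fields are assumptions, since the patent does not specify a wire format:

```python
import json

def marker_update_message(sender_id: str, map_pos: tuple, version: int) -> str:
    """Serialize a marker point update so it can be shared with the
    terminals of associated objects (teammates)."""
    return json.dumps({
        "type": "marker_update",
        "sender": sender_id,
        "map_x": map_pos[0],
        "map_y": map_pos[1],
        "version": version,  # lets receivers detect stale markers
    })
```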
- the terminal can also display the location of the associated object on the current display interface.
- the terminal determines, according to the location coordinates of the marker point location on the global map, the target scene region that is marked in the virtual scene by the marker point location.
- the target scene area may be the area within a first preset range of the scene position coordinates corresponding to the marker point location in the virtual scene; or the area where the feature element corresponding to the scene position coordinates is located; or the scene area found, from a stored correspondence between position coordinates and scene areas, for the position coordinates of the marker point location on the global map.
- the terminal may use a circular area centered on the scene position coordinates, that is, with center point O and radius R, as the target scene area.
- the terminal may also use a square centered on the scene position coordinate and a preset distance as a side length as the target scene area.
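- The two area shapes, together with the map-to-scene coordinate scaling they rely on, can be sketched as follows. A linear mapping between map and scene coordinates is assumed; the patent does not fix the scaling function:

```python
import math

def map_to_scene(map_pos, map_size, scene_size):
    """Linearly scale marker point coordinates on the global map to
    scene position coordinates in the virtual scene."""
    sx = map_pos[0] / map_size[0] * scene_size[0]
    sy = map_pos[1] / map_size[1] * scene_size[1]
    return sx, sy

def circular_area(center, radius):
    """Target scene area as a circle with center O and radius R."""
    cx, cy = center
    return lambda x, y: math.hypot(x - cx, y - cy) <= radius

def square_area(center, side):
    """Alternative: a square centered on the scene position coordinates
    with a preset distance as its side length."""
    cx, cy = center
    half = side / 2.0
    return lambda x, y: abs(x - cx) <= half and abs(y - cy) <= half
```

Either constructor returns a membership test, so the arrival check in the earlier sketch can be reused unchanged with whichever shape the implementation prefers.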
- the terminal determines whether the view image of the current control object includes the target scene area.
- the perspective image is the virtual scene observable within the field of view of the current control object in the virtual scene.
- the terminal displays the prompt signal at the position corresponding to the target scene area in the perspective image.
- the prompt signal may be, for example, beacon smoke rising in the direction of the target scene area at a distant horizon position, a flashing signal light, and the like.
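- A minimal sketch of the decision between an in-view prompt and a directional hint, assuming a simple 2D heading and field-of-view test; the actual view-frustum test is not specified in the patent:

```python
import math

def bearing_offset(obj_pos, heading_deg, target_pos):
    """Signed angle (degrees) from the object's heading to the target."""
    dx = target_pos[0] - obj_pos[0]
    dy = target_pos[1] - obj_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))
    return (bearing - heading_deg + 180.0) % 360.0 - 180.0

def choose_prompt(obj_pos, heading_deg, fov_deg, target_pos):
    """Return 'in_view' when the target scene area falls within the field
    of view (prompt drawn near the area), otherwise 'direction_hint' with
    the angle toward it (e.g. beacon smoke on the horizon)."""
    off = bearing_offset(obj_pos, heading_deg, target_pos)
    if abs(off) <= fov_deg / 2.0:
        return ("in_view", off)
    return ("direction_hint", off)
```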
- the terminal displays a text prompt, an icon prompt, or a display special effect in the target scene area.
- the target scene area may be a circular area
- the terminal may mark the circular area in a target color different from the surrounding color, to distinguish it from other areas in the perspective image.
- the terminal stops displaying a text prompt, an icon prompt, or a display effect in the perspective image.
- steps 1301-1308 are the same as steps 201-205 of the previous embodiment; their specific processes are not described again in this embodiment of the present invention.
- the terminal acquires the marker point location in the global map, synchronizes the marker point location to the associated object, and, based on the marker point location, clearly and accurately marks the corresponding target scene area in the perspective image of the current control object, so that the current control object can accurately and quickly reach the target scene area according to the prompt signal, which greatly improves the accuracy of the actual display position of the marker point.
- FIG. 14 is a schematic structural diagram of a marker position display device according to an embodiment of the present invention.
- the apparatus includes: an obtaining module 1401, a determining module 1402, and a display module 1403.
- the obtaining module 1401 is configured to acquire a marker point location in a global map of the virtual scene, where the global map is used to display a thumbnail of the virtual scene;
- a determining module 1402 configured to determine a target scene area indicated by the marker point location in the virtual scene, where the target scene area is the area marked by the marker point location;
- the display module 1403 is configured to display a prompt signal in the perspective image of the current control object, where the prompt signal is used to indicate the location of the target scene area in the virtual scene, and the perspective image is the virtual scene observable within the field of view of the current control object in the virtual scene.
- the determining module 1402 is further configured to determine, according to the position coordinates of the marker point location in the global map, the corresponding scene position coordinates of the marker point location in the virtual scene, and to determine the area within a first preset range of the scene position coordinates as the target scene area.
- the determining module 1402 is further configured to determine, according to the position coordinates of the marker point location in the global map, the scene position coordinates of the marker point location in the virtual scene, and to determine the area corresponding to the feature element at the scene position coordinates as the target scene area.
- the determining module 1402 is further configured to look up, according to the position coordinates of the marker point location in the global map, the target scene area corresponding to the position coordinates in the virtual scene from the correspondence between position coordinates and scene areas.
- the display module 1403 is further configured to display the prompt signal in a second preset range of the target scene area if the target scene area is included in the perspective image of the current control object.
- the display module 1403 is further configured to display the prompt signal in a direction of the target scene region in the perspective image if the target scene region is not included in the perspective image of the current control object.
- the display module 1403 is further configured to display the text prompt within a third preset range corresponding to the target scene area in the perspective image.
- the display module 1403 is further configured to add the icon prompt within a fourth preset range corresponding to the target scene area in the perspective image.
- the display module 1403 is further configured to add the display special effect within a fifth preset range corresponding to the target scene area in the perspective image.
- the display module 1403 is further configured to change the original display color of the target scene area to a specified color.
- the display module 1403 is further configured to display a dynamic picture on a layer above the layer of the target scene area.
- the display module 1403 is further configured to play a blinking effect in the fifth preset range.
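- The three special-effect variants handled by the display module can be illustrated with a simple dispatch; the frame counts and the color value below are arbitrary assumptions, not values from the patent:

```python
from enum import Enum, auto

class EffectKind(Enum):
    TARGET_COLOR = auto()     # repaint the area in a target color
    DYNAMIC_PICTURE = auto()  # animated picture on a layer above the area
    BLINK = auto()            # blinking effect within the fifth preset range

def effect_frame(kind: EffectKind, frame: int) -> dict:
    """Describe one frame of the chosen display special effect."""
    if kind is EffectKind.TARGET_COLOR:
        return {"fill": "#ff3b30"}              # color distinct from surroundings
    if kind is EffectKind.DYNAMIC_PICTURE:
        return {"overlay_frame": frame % 8}     # cycle an 8-frame animation
    return {"visible": (frame // 15) % 2 == 0}  # toggle visibility to blink
```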
- the device further includes:
- a stopping module configured to stop displaying the prompt signal in the perspective image when the position of the current control object is in the target scene area or the marker point location is updated.
- the obtaining module 1401 is further configured to: when a selection operation on the global map is detected, acquire the position coordinates selected by the selection operation, and determine the position corresponding to the position coordinates as the marker point location.
- the obtaining module 1401 is further configured to receive the marker point location selected by the associated object of the current control object.
- when the terminal acquires the marker point location in the global map, the terminal can display a prompt signal in the perspective image of the current control object based on the target scene area indicated by the marker point location in the virtual scene, thereby clearly and accurately identifying the location of the target scene area in the virtual scene, so that the current control object can accurately and quickly reach the target scene area according to the prompt signal, which greatly improves the accuracy of the actual display position of the marker point.
- in the marker point position display device provided by the above embodiment, the division of the functional modules described above when displaying the marker point position is merely illustrated by example. In practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the terminal may be divided into different functional modules to complete all or part of the functions described above.
- the marker point position display device provided by the above embodiment belongs to the same concept as the embodiments of the marker point position display method; for its specific implementation process, refer to the method embodiments, and details are not described herein again.
- FIG. 15 is a block diagram showing the structure of an electronic device 1500 according to an exemplary embodiment of the present invention.
- the electronic device 1500 can be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop, or a desktop computer.
- Electronic device 1500 may also be referred to as a user device, a portable terminal, a laptop terminal, a desktop terminal, and the like.
- the electronic device 1500 includes a processor 1501 and a memory 1502.
- the processor 1501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like.
- the processor 1501 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array).
- the processor 1501 may also include a main processor and a coprocessor.
- the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state.
- the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display needs to display.
- the processor 1501 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
- Memory 1502 can include one or more computer readable storage media, which can be non-transitory.
- the memory 1502 may also include high-speed random access memory, as well as non-volatile memory such as one or more magnetic disk storage devices or flash memory storage devices.
- the non-transitory computer readable storage medium in memory 1502 is configured to store at least one instruction for execution by the processor 1501 to implement the following steps of the marker point position display method provided by the method embodiments of the present application:
- the processor 1501 is further configured to:
- the corresponding scene position coordinates of the marker point location in the virtual scene are determined according to the position coordinates of the marker point location in the global map, and the area within a first preset range of the scene position coordinates is determined as the target scene area.
- the processor 1501 is further configured to:
- the scene position coordinates of the marker point location in the virtual scene are determined according to the position coordinates of the marker point location in the global map, and the area corresponding to the feature element at the scene position coordinates is determined as the target scene area.
- the processor 1501 is further configured to:
- the target scene area corresponding to the position coordinates in the virtual scene is looked up from the correspondence between position coordinates and scene areas.
- the processor 1501 is further configured to:
- the prompt signal is displayed in a second preset range of the target scene area.
- the processor 1501 is further configured to:
- the prompt signal is displayed in the direction of the target scene area in the perspective image.
- the processor 1501 is further configured to:
- a text prompt is displayed within a third preset range corresponding to the target scene area in the perspective image.
- the processor 1501 is further configured to:
- an icon prompt is added within a fourth preset range corresponding to the target scene area in the perspective image.
- the processor 1501 is further configured to:
- a display special effect is added within a fifth preset range corresponding to the target scene area in the perspective image.
- the processor 1501 is further configured to:
- the original display color of the target scene area is changed to a target color.
- the processor 1501 is further configured to:
- a dynamic picture is displayed on the layer above the target scene area.
- the processor 1501 is further configured to:
- a flicker effect is played within the fifth preset range.
- the processor 1501 is further configured to:
- when the position of the current control object is in the target scene area or the marker point location is updated, display of the prompt signal in the perspective image is stopped.
- the processor 1501 is further configured to:
- when a selection operation on the global map is detected, the position coordinates selected by the selection operation are acquired, and the position corresponding to the position coordinates is determined as the marker point location.
- the processor 1501 is further configured to:
- the marker point location selected by the associated object of the current control object is received.
- the electronic device 1500 optionally further includes: a peripheral device interface 1503 and at least one peripheral device.
- the processor 1501, the memory 1502, and the peripheral device interface 1503 may be connected by a bus or a signal line.
- Each peripheral device can be connected to the peripheral device interface 1503 via a bus, signal line, or circuit board.
- the peripheral device includes at least one of a radio frequency circuit 1504, a touch display screen 1505, a camera 1506, an audio circuit 1507, a positioning component 1508, and a power source 1509.
- the peripheral device interface 1503 can be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1501 and the memory 1502.
- in some embodiments, the processor 1501, the memory 1502, and the peripheral device interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral device interface 1503 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
- the RF circuit 1504 is for receiving and transmitting an RF (Radio Frequency) signal, also called an electromagnetic signal.
- Radio frequency circuit 1504 communicates with the communication network and other communication devices via electromagnetic signals.
- the radio frequency circuit 1504 converts the electrical signal into an electromagnetic signal for transmission, or converts the received electromagnetic signal into an electrical signal.
- the radio frequency circuit 1504 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like.
- Radio frequency circuit 1504 can communicate with other terminals via at least one wireless communication protocol.
- the wireless communication protocol includes, but is not limited to, a metropolitan area network, various generations of mobile communication networks (2G, 3G, 4G, and 5G), a wireless local area network, and/or a WiFi (Wireless Fidelity) network.
- the radio frequency circuit 1504 may further include an NFC (Near Field Communication) related circuit, which is not limited in this application.
- the display 1505 is used to display a UI (User Interface).
- the UI can include graphics, text, icons, video, and any combination thereof.
- display 1505 is a touch display
- display 1505 also has the ability to capture touch signals on or above the surface of display 1505.
- the touch signal can be input to the processor 1501 as a control signal for processing.
- the display 1505 can also be used to provide virtual buttons and/or virtual keyboards, also referred to as soft buttons and/or soft keyboards.
- in some embodiments, there may be one display screen 1505, disposed on the front panel of the electronic device 1500; in other embodiments, there may be at least two display screens 1505, respectively disposed on different surfaces of the electronic device 1500 or in a folded design.
- the display screen 1505 can be a flexible display screen disposed on a curved surface or a folded surface of the electronic device 1500; the display screen 1505 can even be set in a non-rectangular irregular shape, that is, a shaped screen.
- the display screen 1505 can be prepared by using an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
- Camera component 1506 is used to capture images or video.
- camera assembly 1506 includes a front camera and a rear camera.
- the front camera is placed on the front panel of the terminal, and the rear camera is placed on the back of the terminal.
- there are at least two rear cameras, each being one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so as to realize the background blur function by fusing the main camera and the depth-of-field camera, and panoramic shooting and VR (Virtual Reality) shooting or other fused shooting functions by combining the main camera and the wide-angle camera.
- camera assembly 1506 can also include a flash.
- the flash can be a monochrome temperature flash or a two-color temperature flash.
- the two-color temperature flash is a combination of a warm flash and a cool flash that can be used for light compensation at different color temperatures.
- the audio circuit 1507 can include a microphone and a speaker.
- the microphone is used to collect sound waves of the user and the environment, and convert the sound waves into electrical signals that are input to the processor 1501 for processing, or input to the radio frequency circuit 1504 for voice communication.
- the microphones may be multiple, and are respectively disposed at different parts of the electronic device 1500.
- the microphone can also be an array microphone or an omnidirectional acquisition microphone.
- the speaker is then used to convert electrical signals from the processor 1501 or the RF circuit 1504 into sound waves.
- the speaker can be a conventional film speaker or a piezoelectric ceramic speaker.
- the audio circuit 1507 can also include a headphone jack.
- the location component 1508 is used to locate the current geographic location of the electronic device 1500 to implement navigation or LBS (Location Based Service).
- the positioning component 1508 can be a positioning component based on the US GPS (Global Positioning System), the Chinese BeiDou system, the Russian GLONASS system, or the EU Galileo system.
- a power source 1509 is used to power various components in the electronic device 1500.
- the power source 1509 can be an alternating current power source, a direct current power source, a disposable battery, or a rechargeable battery.
- the rechargeable battery can support wired charging or wireless charging.
- the rechargeable battery can also be used to support fast charging technology.
- electronic device 1500 also includes one or more sensors 1510.
- the one or more sensors 1510 include, but are not limited to, an acceleration sensor 1511, a gyro sensor 1512, a pressure sensor 1513, a fingerprint sensor 1514, an optical sensor 1515, and a proximity sensor 1516.
- the acceleration sensor 1511 can detect the magnitude of the acceleration on the three coordinate axes of the coordinate system established by the electronic device 1500.
- the acceleration sensor 1511 can be used to detect components of gravity acceleration on three coordinate axes.
- the processor 1501 can control the touch display 1505 to display the user interface in a landscape view or a portrait view according to the gravity acceleration signal collected by the acceleration sensor 1511.
- the acceleration sensor 1511 can also be used for the acquisition of game or user motion data.
- the gyro sensor 1512 can detect the body direction and the rotation angle of the electronic device 1500, and the gyro sensor 1512 can cooperate with the acceleration sensor 1511 to collect the 3D motion of the user on the electronic device 1500. Based on the data collected by the gyro sensor 1512, the processor 1501 can implement functions such as motion sensing (such as changing the UI according to the user's tilting operation), image stabilization at the time of shooting, game control, and inertial navigation.
- the pressure sensor 1513 can be disposed on a side border of the electronic device 1500 and/or a lower layer of the touch display screen 1505.
- when the pressure sensor 1513 is disposed on the side frame of the electronic device 1500, the user's grip signal on the electronic device 1500 can be detected, and the processor 1501 performs left/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 1513.
- when the pressure sensor 1513 is disposed at the lower layer of the touch display screen 1505, the processor 1501 controls the operability controls on the UI according to the user's pressure operation on the touch display screen 1505.
- the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
- the fingerprint sensor 1514 is configured to collect the user's fingerprint, and the user's identity is identified by the processor 1501 according to the fingerprint collected by the fingerprint sensor 1514, or identified by the fingerprint sensor 1514 itself according to the collected fingerprint. Upon identifying the user's identity as a trusted identity, the processor 1501 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like.
- the fingerprint sensor 1514 can be disposed on the front, back, or side of the electronic device 1500. When a physical button or a vendor logo is provided on the electronic device 1500, the fingerprint sensor 1514 can be integrated with the physical button or the vendor logo.
- Optical sensor 1515 is used to collect ambient light intensity.
- the processor 1501 can control the display brightness of the touch display 1505 based on the ambient light intensity collected by the optical sensor 1515: when the ambient light intensity is high, the display brightness of the touch display screen 1505 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1505 is decreased.
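- As a rough illustration of this control rule; the lux thresholds and the logarithmic mapping are assumptions for the sketch, not values from the patent:

```python
import math

def display_brightness(ambient_lux: float, lo: float = 50.0,
                       hi: float = 10000.0) -> float:
    """Map ambient light intensity to a display brightness in [0.2, 1.0]:
    brighter surroundings raise the brightness, darker ones lower it."""
    lux = min(max(ambient_lux, lo), hi)       # clamp to the useful range
    t = math.log(lux / lo) / math.log(hi / lo)  # 0.0 at lo, 1.0 at hi
    return 0.2 + 0.8 * t
```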
- the processor 1501 can also dynamically adjust the shooting parameters of the camera assembly 1506 according to the ambient light intensity acquired by the optical sensor 1515.
- the proximity sensor 1516, also referred to as a distance sensor, is typically disposed on the front panel of the electronic device 1500. The proximity sensor 1516 is used to capture the distance between the user and the front of the electronic device 1500. In one embodiment, when the proximity sensor 1516 detects that the distance between the user and the front of the electronic device 1500 gradually decreases, the processor 1501 controls the touch display screen 1505 to switch from the bright-screen state to the screen-off state; when the proximity sensor 1516 detects that the distance between the user and the front of the electronic device 1500 gradually increases, the processor 1501 controls the touch display screen 1505 to switch from the screen-off state to the bright-screen state.
- a person skilled in the art can understand that the structure shown in FIG. 15 does not constitute a limitation on the electronic device 1500, which may include more or fewer components than illustrated, combine certain components, or employ a different component arrangement.
- in an exemplary embodiment, a computer readable storage medium is further provided, such as a memory including instructions executable by a processor in a terminal to perform the marker point position display method in the foregoing embodiments.
- the computer readable storage medium can be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device.
- a person skilled in the art may understand that all or part of the steps of implementing the above embodiments may be completed by hardware, or may be completed by a program instructing related hardware; the program may be stored in a computer readable storage medium.
- the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims (20)
- A marker point position display method, wherein the method is applied to an electronic device and comprises: acquiring a marker point position in a global map of a virtual scene, the global map being used to display a thumbnail of the virtual scene; determining a target scene area indicated by the marker point position in the virtual scene, the target scene area being the area marked by the marker point position; and displaying a prompt signal in a perspective image of a current control object, the prompt signal being used to indicate the position of the target scene area in the virtual scene, the perspective image being the virtual scene observable within the field of view of the current control object in the virtual scene.
- The method according to claim 1, wherein determining the target scene area indicated by the marker point position in the virtual scene comprises: determining, according to the position coordinates of the marker point position on the global map, the corresponding scene position coordinates of the marker point position in the virtual scene, and determining the area within a first preset range of the scene position coordinates as the target scene area.
- The method according to claim 1, wherein determining the target scene area indicated by the marker point position in the virtual scene comprises: determining, according to the position coordinates of the marker point position on the global map, the scene position coordinates of the marker point position in the virtual scene, and determining the area corresponding to the feature element at the scene position coordinates as the target scene area.
- The method according to claim 1, wherein determining the target scene area indicated by the marker point position in the virtual scene comprises: looking up, according to the position coordinates of the marker point position on the global map, the target scene area corresponding to the position coordinates in the virtual scene from a correspondence between position coordinates and scene areas.
- The method according to claim 1, wherein displaying the prompt signal in the perspective image of the current control object comprises: if the perspective image of the current control object includes the target scene area, displaying the prompt signal within a second preset range of the target scene area.
- The method according to claim 1, wherein displaying the prompt signal in the perspective image of the current control object comprises: if the perspective image of the current control object does not include the target scene area, displaying the prompt signal in the direction of the target scene area in the perspective image.
- The method according to claim 1, wherein displaying the prompt signal in the perspective image of the current control object comprises: displaying a text prompt within a third preset range corresponding to the target scene area in the perspective image.
- The method according to claim 1, wherein displaying the prompt signal in the perspective image of the current control object comprises: adding an icon prompt within a fourth preset range corresponding to the target scene area in the perspective image.
- The method according to claim 1, wherein displaying the prompt signal in the perspective image of the current control object comprises: adding a display special effect within a fifth preset range corresponding to the target scene area in the perspective image.
- The method according to claim 9, wherein adding the display special effect within the fifth preset range corresponding to the target scene area in the perspective image comprises: changing the original display color of the target scene area to a target color.
- The method according to claim 9, wherein adding the display special effect within the fifth preset range corresponding to the target scene area in the perspective image comprises: displaying a dynamic picture on a layer above the layer of the target scene area.
- The method according to claim 9, wherein adding the display special effect within the fifth preset range corresponding to the target scene area in the perspective image comprises: playing a blinking special effect within the fifth preset range.
- The method according to claim 1, further comprising: stopping displaying the prompt signal in the perspective image when the position of the current control object is in the target scene area or the marker point position is updated.
- The method according to claim 1, wherein acquiring the marker point position in the global map of the virtual scene comprises: when a selection operation on the global map is detected, acquiring the position coordinates selected by the selection operation, and determining the position corresponding to the position coordinates as the marker point position.
- The method according to claim 1, wherein acquiring the marker point position in the global map of the virtual scene comprises: receiving the marker point position selected by an associated object of the current control object.
- An electronic device, comprising: one or more processors; and a memory for storing a computer program; wherein the processor is configured to execute the computer program stored in the memory to implement the following steps: acquiring a marker point position in a global map of a virtual scene, the global map being used to display a thumbnail of the virtual scene; determining a target scene area indicated by the marker point position in the virtual scene, the target scene area being the area marked by the marker point position; and displaying a prompt signal in a perspective image of a current control object, the prompt signal being used to indicate the position of the target scene area in the virtual scene, the perspective image being the virtual scene observable within the field of view of the current control object in the virtual scene.
- The electronic device according to claim 16, wherein the processor is further configured to: determine, according to the position coordinates of the marker point position on the global map, the corresponding scene position coordinates of the marker point position in the virtual scene, and determine the area within a first preset range of the scene position coordinates as the target scene area.
- The electronic device according to claim 16, wherein the processor is further configured to: determine, according to the position coordinates of the marker point position on the global map, the scene position coordinates of the marker point position in the virtual scene, and determine the area corresponding to the feature element at the scene position coordinates as the target scene area.
- The electronic device according to claim 16, wherein the processor is further configured to: if the perspective image of the current control object includes the target scene area, display the prompt signal within a second preset range of the target scene area.
- A computer-readable storage medium, wherein at least one instruction is stored in the storage medium, and the instruction is loaded and executed by a processor to implement the operations performed by the marker point position display method according to any one of claims 1 to 15.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020207018243A KR102410802B1 (ko) | 2018-03-22 | 2019-03-07 | 마커 포인트 위치를 표시하기 위한 방법, 전자 장치, 및 컴퓨터 판독가능 저장매체 |
JP2020540758A JP7058842B2 (ja) | 2018-03-22 | 2019-03-07 | マークポイント位置表示方法、電子機器及びコンピュータ読取可能な記憶媒体 |
US16/888,142 US11221726B2 (en) | 2018-03-22 | 2020-05-29 | Marker point location display method, electronic device, and computer-readable storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810241937.6 | 2018-03-22 | ||
CN201810241937.6A CN108465240B (zh) | 2018-03-22 | 2018-03-22 | 标记点位置显示方法、装置、终端及计算机可读存储介质 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/888,142 Continuation US11221726B2 (en) | 2018-03-22 | 2020-05-29 | Marker point location display method, electronic device, and computer-readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019179314A1 true WO2019179314A1 (zh) | 2019-09-26 |
Family
ID=63265685
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/077294 WO2019179314A1 (zh) | 2018-03-22 | 2019-03-07 | 标记点位置显示方法、电子设备及计算机可读存储介质 |
Country Status (5)
Country | Link |
---|---|
US (1) | US11221726B2 (zh) |
JP (1) | JP7058842B2 (zh) |
KR (1) | KR102410802B1 (zh) |
CN (1) | CN108465240B (zh) |
WO (1) | WO2019179314A1 (zh) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111369684A (zh) * | 2019-12-10 | 2020-07-03 | 杭州海康威视系统技术有限公司 | 目标跟踪方法、装置、设备及存储介质 |
CN111773703A (zh) * | 2020-07-31 | 2020-10-16 | 网易(杭州)网络有限公司 | 游戏对象显示方法、装置、存储介质与终端设备 |
CN112507651A (zh) * | 2020-11-30 | 2021-03-16 | 上海柏楚数控科技有限公司 | 电路板设计的数据处理方法、装置、电子设备与存储介质 |
CN113082702A (zh) * | 2021-04-15 | 2021-07-09 | 网易(杭州)网络有限公司 | 游戏的显示控制方法及电子设备 |
CN113426125A (zh) * | 2021-07-02 | 2021-09-24 | 网易(杭州)网络有限公司 | 游戏中的虚拟单位控制方法及装置、存储介质、电子设备 |
CN113608616A (zh) * | 2021-08-10 | 2021-11-05 | 深圳市慧鲤科技有限公司 | 虚拟内容的显示方法及装置、电子设备和存储介质 |
CN113722644A (zh) * | 2021-09-03 | 2021-11-30 | 北京房江湖科技有限公司 | 基于外接设备在虚拟空间选取浏览点位的方法及其装置 |
WO2022252927A1 (zh) * | 2021-06-03 | 2022-12-08 | 腾讯科技(深圳)有限公司 | 虚拟对象的位置提示方法、装置、终端及存储介质 |
CN115967779A (zh) * | 2022-12-27 | 2023-04-14 | 北京爱奇艺科技有限公司 | 虚拟摄像机机位图的展示方法、装置、电子设备和介质 |
Families Citing this family (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108465240B (zh) | 2018-03-22 | 2020-08-11 | 腾讯科技(深圳)有限公司 | 标记点位置显示方法、装置、终端及计算机可读存储介质 |
CN109701271A (zh) * | 2018-12-24 | 2019-05-03 | 网易(杭州)网络有限公司 | 显示图标的方法、装置和系统 |
CN111385525B (zh) * | 2018-12-28 | 2021-08-17 | 杭州海康机器人技术有限公司 | 视频监控方法、装置、终端及系统 |
CN109568957B (zh) * | 2019-01-10 | 2020-02-07 | 网易(杭州)网络有限公司 | 游戏中的显示控制方法、装置、存储介质、处理器及终端 |
CN109976650B (zh) * | 2019-01-25 | 2021-03-02 | 网易(杭州)网络有限公司 | 人机交互方法、装置和电子设备 |
CN109806585B (zh) * | 2019-02-19 | 2024-02-23 | 网易(杭州)网络有限公司 | 游戏的显示控制方法、装置、设备和存储介质 |
CN111991800B (zh) * | 2019-02-22 | 2024-06-21 | 网易(杭州)网络有限公司 | 游戏角色控制方法、装置、设备和存储介质 |
CN110058684B (zh) * | 2019-03-21 | 2022-07-01 | 海南诺亦腾海洋科技研究院有限公司 | 一种基于vr技术的地理信息交互方法、系统及存储介质 |
CN110115838B (zh) * | 2019-05-30 | 2021-10-29 | 腾讯科技(深圳)有限公司 | 虚拟环境中生成标记信息的方法、装置、设备及存储介质 |
CN110270098B (zh) | 2019-06-21 | 2023-06-23 | 腾讯科技(深圳)有限公司 | 控制虚拟对象对虚拟物品进行标记的方法、装置及介质 |
CN110473146B (zh) * | 2019-08-16 | 2022-12-27 | 苏州超擎图形软件科技发展有限公司 | 遥感图像显示方法、装置及存储介质和计算机设备 |
CN110807826B (zh) * | 2019-10-30 | 2023-04-07 | 腾讯科技(深圳)有限公司 | 虚拟场景中的地图显示方法、装置、设备及存储介质 |
CN111318019B (zh) * | 2020-03-02 | 2021-12-28 | 腾讯科技(深圳)有限公司 | 虚拟对象的控制方法、装置、终端及存储介质 |
CN111589114B (zh) * | 2020-05-12 | 2023-03-10 | 腾讯科技(深圳)有限公司 | 虚拟对象的选择方法、装置、终端及存储介质 |
CN111760288B (zh) * | 2020-06-10 | 2024-03-12 | 网易(上海)网络有限公司 | 虚拟三维场景中方位显示方法、装置、终端和存储介质 |
CN111672125B (zh) * | 2020-06-10 | 2022-03-01 | 腾讯科技(深圳)有限公司 | 一种虚拟对象交互的方法以及相关装置 |
CN111744181B (zh) * | 2020-07-01 | 2024-07-23 | 网易(杭州)网络有限公司 | 游戏中信息的显示方法、装置、游戏客户端及介质 |
CN113064955B (zh) * | 2020-08-26 | 2022-02-25 | 视伴科技(北京)有限公司 | 一种显示地理标记信息的方法及装置 |
CN112116719B (zh) * | 2020-08-28 | 2024-05-10 | 北京五一视界数字孪生科技股份有限公司 | 三维场景中对象的确定方法、装置、存储介质和电子设备 |
US11351457B2 (en) * | 2020-09-11 | 2022-06-07 | Riot Games, Inc. | Selecting an anchored offset targeting position |
CN112121417B (zh) * | 2020-09-30 | 2022-04-15 | 腾讯科技(深圳)有限公司 | 虚拟场景中的事件处理方法、装置、设备及存储介质 |
CN112539752B (zh) * | 2020-12-11 | 2023-12-26 | 维沃移动通信有限公司 | 室内定位方法、室内定位装置 |
CN113018864B (zh) * | 2021-03-26 | 2024-02-13 | 网易(杭州)网络有限公司 | 虚拟对象的提示方法、装置、存储介质及计算机设备 |
CN113318428B (zh) * | 2021-05-25 | 2024-09-20 | 网易(杭州)网络有限公司 | 游戏的显示控制方法、非易失性存储介质及电子装置 |
CN113421343B (zh) * | 2021-05-27 | 2024-06-04 | 深圳市晨北科技有限公司 | 基于增强现实观测设备内部结构的方法 |
CN113436253B (zh) * | 2021-06-28 | 2023-05-16 | 华科融资租赁有限公司 | 地图的定位显示方法、装置、计算机设备和存储介质 |
CN113440847B (zh) * | 2021-07-01 | 2024-10-25 | 网易(杭州)网络有限公司 | 标记信息的处理方法和装置 |
CN113546429B (zh) * | 2021-08-12 | 2024-07-02 | 网易(杭州)网络有限公司 | 虚拟对象降落方法、装置、计算机设备及存储介质 |
JP7319635B2 (ja) * | 2021-11-02 | 2023-08-02 | グリー株式会社 | コンピュータプログラム、仮想空間表示装置、及び仮想空間表示方法 |
CN114332417B (zh) * | 2021-12-13 | 2023-07-14 | 亮风台(上海)信息科技有限公司 | 一种多人场景交互的方法、设备、存储介质及程序产品 |
CN114296609B (zh) * | 2022-03-09 | 2022-05-31 | 广州三七极耀网络科技有限公司 | 一种界面处理方法、装置、电子设备及存储介质 |
CN114935973A (zh) * | 2022-04-11 | 2022-08-23 | 北京达佳互联信息技术有限公司 | 互动处理方法、装置、设备及存储介质 |
CN118831322A (zh) * | 2023-04-24 | 2024-10-25 | 腾讯科技(深圳)有限公司 | 消息处理方法和装置、存储介质及电子设备 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0981772A (ja) * | 1995-09-19 | 1997-03-28 | Namco Ltd | 画像合成方法及び装置 |
JPH10113466A (ja) * | 1996-08-21 | 1998-05-06 | Namco Ltd | ゲーム装置、画像合成方法及び情報記憶媒体 |
US20070010325A1 (en) * | 2005-07-06 | 2007-01-11 | Yu Suzuki | Video game control program and video game device |
CN107715454A (zh) * | 2017-09-01 | 2018-02-23 | 网易(杭州)网络有限公司 | 信息处理方法、装置、电子设备及存储介质 |
CN107741818A (zh) * | 2017-09-01 | 2018-02-27 | 网易(杭州)网络有限公司 | 信息处理方法、装置、电子设备及存储介质 |
CN108465240A (zh) * | 2018-03-22 | 2018-08-31 | 腾讯科技(深圳)有限公司 | 标记点位置显示方法、装置、终端及计算机可读存储介质 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10309381A (ja) * | 1997-05-13 | 1998-11-24 | Yoshimasa Tanaka | 移動体用遊戯装置 |
US6079982A (en) * | 1997-12-31 | 2000-06-27 | Meader; Gregory M | Interactive simulator ride |
US7788323B2 (en) * | 2000-09-21 | 2010-08-31 | International Business Machines Corporation | Method and apparatus for sharing information in a virtual environment |
US20040113887A1 (en) * | 2002-08-27 | 2004-06-17 | University Of Southern California | partially real and partially simulated modular interactive environment |
US8458028B2 (en) * | 2002-10-16 | 2013-06-04 | Barbaro Technologies | System and method for integrating business-related content into an electronic game |
JP2006162552A (ja) | 2004-12-10 | 2006-06-22 | Pioneer Electronic Corp | 情報表示装置、情報表示方法および情報表示プログラム |
US7817150B2 (en) * | 2005-09-30 | 2010-10-19 | Rockwell Automation Technologies, Inc. | Three-dimensional immersive system for representing an automation control environment |
KR100809479B1 (ko) * | 2006-07-27 | 2008-03-03 | 한국전자통신연구원 | 혼합 현실 환경을 위한 얼굴 착용형 디스플레이 장치 |
US20080030429A1 (en) * | 2006-08-07 | 2008-02-07 | International Business Machines Corporation | System and method of enhanced virtual reality |
JP2008070705A (ja) | 2006-09-15 | 2008-03-27 | Cad Center:Kk | 画像提供システムおよび画像提供サーバ装置 |
US20080125218A1 (en) * | 2006-09-20 | 2008-05-29 | Kelly James Collins | Method of use for a commercially available portable virtual reality system |
US9387402B2 (en) * | 2007-09-18 | 2016-07-12 | Disney Enterprises, Inc. | Method and system for converting a computer virtual environment into a real-life simulation environment |
US8368721B2 (en) * | 2007-10-06 | 2013-02-05 | Mccoy Anthony | Apparatus and method for on-field virtual reality simulation of US football and other sports |
JP2011214861A (ja) | 2010-03-31 | 2011-10-27 | Zenrin Co Ltd | 案内システム |
US9338622B2 (en) * | 2012-10-04 | 2016-05-10 | Bernt Erik Bjontegard | Contextually intelligent communication systems and processes |
KR102332752B1 (ko) * | 2014-11-24 | 2021-11-30 | 삼성전자주식회사 | 지도 서비스를 제공하는 전자 장치 및 방법 |
US10142795B2 (en) * | 2015-02-16 | 2018-11-27 | Tourblend Innovations, Llc | Providing digital content for multiple venues |
-
2018
- 2018-03-22 CN CN201810241937.6A patent/CN108465240B/zh active Active
-
2019
- 2019-03-07 JP JP2020540758A patent/JP7058842B2/ja active Active
- 2019-03-07 KR KR1020207018243A patent/KR102410802B1/ko active IP Right Grant
- 2019-03-07 WO PCT/CN2019/077294 patent/WO2019179314A1/zh active Application Filing
-
2020
- 2020-05-29 US US16/888,142 patent/US11221726B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0981772A (ja) * | 1995-09-19 | 1997-03-28 | Namco Ltd | 画像合成方法及び装置 |
JPH10113466A (ja) * | 1996-08-21 | 1998-05-06 | Namco Ltd | ゲーム装置、画像合成方法及び情報記憶媒体 |
US20070010325A1 (en) * | 2005-07-06 | 2007-01-11 | Yu Suzuki | Video game control program and video game device |
CN107715454A (zh) * | 2017-09-01 | 2018-02-23 | 网易(杭州)网络有限公司 | 信息处理方法、装置、电子设备及存储介质 |
CN107741818A (zh) * | 2017-09-01 | 2018-02-27 | 网易(杭州)网络有限公司 | 信息处理方法、装置、电子设备及存储介质 |
CN108465240A (zh) * | 2018-03-22 | 2018-08-31 | 腾讯科技(深圳)有限公司 | 标记点位置显示方法、装置、终端及计算机可读存储介质 |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111369684A (zh) * | 2019-12-10 | 2020-07-03 | 杭州海康威视系统技术有限公司 | 目标跟踪方法、装置、设备及存储介质 |
CN111369684B (zh) * | 2019-12-10 | 2023-09-01 | 杭州海康威视系统技术有限公司 | 目标跟踪方法、装置、设备及存储介质 |
CN111773703A (zh) * | 2020-07-31 | 2020-10-16 | 网易(杭州)网络有限公司 | 游戏对象显示方法、装置、存储介质与终端设备 |
CN111773703B (zh) * | 2020-07-31 | 2023-10-20 | 网易(杭州)网络有限公司 | 游戏对象显示方法、装置、存储介质与终端设备 |
CN112507651A (zh) * | 2020-11-30 | 2021-03-16 | 上海柏楚数控科技有限公司 | 电路板设计的数据处理方法、装置、电子设备与存储介质 |
CN113082702A (zh) * | 2021-04-15 | 2021-07-09 | 网易(杭州)网络有限公司 | 游戏的显示控制方法及电子设备 |
WO2022252927A1 (zh) * | 2021-06-03 | 2022-12-08 | 腾讯科技(深圳)有限公司 | 虚拟对象的位置提示方法、装置、终端及存储介质 |
CN113426125A (zh) * | 2021-07-02 | 2021-09-24 | 网易(杭州)网络有限公司 | 游戏中的虚拟单位控制方法及装置、存储介质、电子设备 |
CN113608616A (zh) * | 2021-08-10 | 2021-11-05 | 深圳市慧鲤科技有限公司 | 虚拟内容的显示方法及装置、电子设备和存储介质 |
CN113722644A (zh) * | 2021-09-03 | 2021-11-30 | 北京房江湖科技有限公司 | 基于外接设备在虚拟空间选取浏览点位的方法及其装置 |
CN113722644B (zh) * | 2021-09-03 | 2023-07-21 | 如你所视(北京)科技有限公司 | 基于外接设备在虚拟空间选取浏览点位的方法及其装置 |
CN115967779A (zh) * | 2022-12-27 | 2023-04-14 | 北京爱奇艺科技有限公司 | 虚拟摄像机机位图的展示方法、装置、电子设备和介质 |
Also Published As
Publication number | Publication date |
---|---|
CN108465240B (zh) | 2020-08-11 |
JP2021511602A (ja) | 2021-05-06 |
KR20200087260A (ko) | 2020-07-20 |
KR102410802B1 (ko) | 2022-06-22 |
US20200293154A1 (en) | 2020-09-17 |
US11221726B2 (en) | 2022-01-11 |
JP7058842B2 (ja) | 2022-04-25 |
CN108465240A (zh) | 2018-08-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019179314A1 (zh) | 标记点位置显示方法、电子设备及计算机可读存储介质 | |
CN108619721B (zh) | 虚拟场景中的距离信息显示方法、装置及计算机设备 | |
CN108710525B (zh) | 虚拟场景中的地图展示方法、装置、设备及存储介质 | |
KR102500722B1 (ko) | 가상 프롭 이전 방법 및 장치와, 전자 기기 및 컴퓨터 저장 매체 | |
WO2019153824A1 (zh) | 虚拟对象控制方法、装置、计算机设备及存储介质 | |
WO2019153750A1 (zh) | 用于对虚拟环境进行视角切换的方法、装置、设备及存储介质 | |
JP7026819B2 (ja) | カメラの位置決め方法および装置、端末並びにコンピュータプログラム | |
WO2019205881A1 (zh) | 虚拟环境中的信息显示方法、装置、设备及存储介质 | |
CN111921197B (zh) | 对局回放画面的显示方法、装置、终端及存储介质 | |
WO2019179237A1 (zh) | 获取实景电子地图的方法、装置、设备和存储介质 | |
CN109646944B (zh) | 控制信息处理方法、装置、电子设备及存储介质 | |
CN112044069A (zh) | 虚拟场景中的对象提示方法、装置、设备及存储介质 | |
CN113198178B (zh) | 虚拟对象的位置提示方法、装置、终端及存储介质 | |
JP2023139033A (ja) | 視点回転の方法、装置、端末およびコンピュータプログラム | |
JP7413563B2 (ja) | 仮想オブジェクトの制御方法、装置、機器及びコンピュータプログラム | |
WO2022227915A1 (zh) | 显示位置标记的方法、装置、设备及存储介质 | |
CN110152309B (zh) | 语音通信方法、装置、电子设备及存储介质 | |
US12061773B2 (en) | Method and apparatus for determining selected target, device, and storage medium | |
WO2022237076A1 (zh) | 虚拟对象的控制方法、装置、设备及计算机可读存储介质 | |
CN112057861B (zh) | 虚拟对象控制方法、装置、计算机设备及存储介质 | |
CN111754631B (zh) | 三维模型的生成方法、装置、设备及可读存储介质 | |
CN113209610A (zh) | 虚拟场景画面展示方法、装置、计算机设备及存储介质 | |
CN116920398A (zh) | 在虚拟世界中的探查方法、装置、设备、介质及程序产品 | |
CN113599809A (zh) | 虚拟地图显示方法、装置、设备及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19772116 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20207018243 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2020540758 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19772116 Country of ref document: EP Kind code of ref document: A1 |