CN116560504A - Interactive method, computer device and computer readable storage medium for performance site - Google Patents
- Publication number: CN116560504A (application CN202310482989.3A)
- Authority: CN (China)
- Prior art keywords: target, module, display, controlling, scene
- Prior art date: 2023-04-28
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/023—Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Abstract
The invention provides an interactive method for a performance site, a computer device and a computer readable storage medium. The interactive method for the performance site comprises the following steps: controlling a first image acquisition module to scan a target scene, and controlling a first UWB module to communicate with a third UWB module; acquiring a first relative position of the third UWB module relative to the first UWB module; controlling a first AR display module to display a first target image of the target scene, determining a first target position of the third UWB module, and determining a first target object; acquiring third target device information of the third UWB module; sending a first identity request instruction to a target server according to the third target device information, and acquiring third target identity information from the target server; controlling the first AR display module to display the third target identity information; sending a first interaction instruction to the target server; and controlling the first AR display module to display a first target interaction effect according to a second interaction instruction. The invention facilitates interaction between performers and spectators.
Description
Technical Field
The invention relates to the technical field of the Internet, and in particular to an interactive method, a computer device and a computer readable storage medium for a performance site.
Background
At a performance site, interaction between spectators and performers is usually limited to speech, so the modes of interaction are limited. For example, at a concert, because of the number of people and the limitations of viewing angle and distance, it is difficult for a spectator to interact with a performer; and when one spectator does interact with the performer, other spectators often cannot see that interaction, which degrades the on-site interactive experience.
Disclosure of Invention
A first object of the present invention is to provide an interactive method for a performance site that facilitates interaction between spectators and performers.
A second object of the present invention is to provide another interactive method for a performance site that facilitates interaction between spectators and performers.
A third object of the present invention is to provide a further interactive method for a performance site that facilitates interaction between spectators and performers.
A fourth object of the present invention is to provide a computer device implementing the above interactive methods for a performance site.
A fifth object of the present invention is to provide a computer readable storage medium storing a computer program implementing the above interactive methods for a performance site.
In order to achieve the first object, the present invention provides an interactive method for a performance site, comprising the following steps: controlling a first image acquisition module to scan a target scene, and controlling a first UWB module to communicate with at least one third UWB module in the target scene; acquiring a first relative position of the third UWB module relative to the first UWB module; controlling a first AR display module to display a first target image of the target scene, determining a first target position of the third UWB module in the first target image according to the first relative position, and determining a first target object at the first target position; acquiring third target device information of the third UWB module; sending a first identity request instruction to a target server according to the third target device information, and acquiring third target identity information corresponding to the third target device information from the target server; controlling the first AR display module to display the third target identity information at the first target position of the first target image; sending a first interaction instruction to the target server, and receiving a second interaction instruction from the target server; and controlling the first AR display module to display a first target interaction effect on the first target object of the first target image according to the second interaction instruction.
According to this scheme, through communication between the first UWB module and the third UWB module, the third target device information of the third UWB module can be acquired, and the third target identity information corresponding to that device information can then be determined. Combined with the image acquisition module and the first AR display module, a first interaction instruction can be initiated and the first target interaction effect can be presented intuitively on the first target object, which facilitates interaction between spectator and performer and improves the interactive experience.
Further, when the first AR display module is controlled to display the first target image of the target scene and the first target position of the third UWB module in the first target image is determined according to the first relative position, the first target object at the first target position is determined according to a preset image recognition algorithm.
It can be seen that, in combination with the image recognition algorithm, the object holding the third UWB module can be determined more accurately.
Further, performance location information is received from the target server before the first image acquisition module is controlled to scan the target scene.
Thus, the spectator can conveniently travel to the performance site.
In order to achieve the second object, the present invention provides an interactive method for a performance site, comprising the following steps: receiving a first identity request instruction from a first user terminal, determining first target identity information according to the first identity request instruction, and sending the first target identity information to the first user terminal; receiving a second identity request instruction from a second user terminal, determining second target identity information according to the second identity request instruction, and sending the second target identity information to the second user terminal; and receiving a first interaction instruction from the first user terminal, determining that the first target identity information is the same as the second target identity information, and sending a second interaction instruction to the first user terminal and the second user terminal.
According to this scheme, by determining that the first target identity information is the same as the second target identity information, it is confirmed that the spectators holding the first user terminal and the second user terminal are in the same target scene and have both acquired, through their image acquisition modules and AR display modules, target images that include the performer. After the first interaction instruction sent by the first user terminal is received, the second interaction instruction is sent to both the first user terminal and the second user terminal in the target scene, so that the spectator holding the second user terminal can see the interaction effect initiated by the spectator holding the first user terminal, improving the interactive experience of spectators in the target scene.
Further, after the first interaction instruction is received, the target account pointed to by the first interaction instruction is updated.
In order to achieve the third object, the present invention provides an interactive method for a performance site, comprising the following steps: controlling a second image acquisition module to scan the target scene, and controlling a second UWB module to communicate with at least one third UWB module in the target scene; acquiring a second relative position of the third UWB module relative to the second UWB module; controlling a second AR display module to display a second target image of the target scene, determining a second target position of the third UWB module in the second target image according to the second relative position, and determining a second target object at the second target position; acquiring third target device information of the third UWB module, and sending the third target device information to a target server; sending a second identity request instruction to the target server according to the third target device information, and acquiring third target identity information corresponding to the third target device information from the target server; controlling the second AR display module to display the third target identity information at the second target position of the second target image; and receiving a second interaction instruction from the target server, and controlling the second AR display module to display a second target interaction effect on the second target object of the second target image.
According to this scheme, through communication between the second UWB module and the third UWB module, the third target device information of the third UWB module can be acquired and the corresponding third target identity information determined. Combined with the second image acquisition module and the second AR display module, the second target interaction effect can be presented intuitively on the second target object according to the received second interaction instruction, improving the interactive experience.
In order to achieve the fourth object, the present invention provides a computer device including a memory and a processor, wherein the memory stores a computer program, and the processor implements the above interactive method for a performance site when executing the computer program in the memory.
In order to achieve the fifth object, the present invention provides a computer readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the above interactive method for a performance site.
Drawings
Fig. 1 is an architecture diagram of the interactive system in the performance site embodiment of the present invention.
Fig. 2 is a flow chart of the steps executed by the first processor in the performance site embodiment of the present invention.
Fig. 3 is a flow chart of the steps executed by the second processor in the performance site embodiment of the present invention.
Fig. 4 is a flow chart of the steps executed by the target server in the performance site embodiment of the present invention.
The invention is further described below with reference to the drawings and examples.
Detailed Description
The interactive method for a performance site of the present invention is implemented based on AR (Augmented Reality) technology and UWB (Ultra Wide Band) technology, and improves the spectator's viewing experience at the performance site. The invention also provides a computer device and a computer readable storage medium implementing the interactive method for the performance site.
Interactive method embodiment for a performance site:
Referring to fig. 1, the interactive method of this embodiment is implemented on an interactive system for a performance site. The interactive system 1 includes a first user terminal 11, a second user terminal 21, a third user terminal 31 and a target server 41, where the first user terminal 11, the second user terminal 21 and the third user terminal 31 can all communicate wirelessly with the target server 41.
The first user terminal 11 includes a first image acquisition module 111, a first AR display module 112, a first UWB module 113 and a first processor 114, and the first processor 114 is connected to the first image acquisition module 111, the first AR display module 112 and the first UWB module 113, respectively.
The second user terminal 21 includes a second image acquisition module 211, a second AR display module 212, a second UWB module 213, and a second processor 214 connected to the second image acquisition module 211, the second AR display module 212, and the second UWB module 213, respectively.
The third user terminal 31 comprises a third UWB module 313.
The first UWB module 113 and the second UWB module 213 may each be in UWB communication with the third UWB module 313.
In this embodiment, the first user terminal 11, the second user terminal 21 and the third user terminal 31 may be smartphones; the first image acquisition module 111 and the second image acquisition module 211 may be the camera modules of the smartphones, and the first AR display module 112 and the second AR display module 212 may be the display modules of the smartphones.
The first UWB module 113 includes a unique device number, i.e., the first target device information. The second UWB module 213 includes a unique device number, i.e., the second target device information. The third UWB module 313 includes a unique device number, i.e., the third target device information.
The target server 41 stores the correspondence among account numbers, the target device information of UWB modules, and target identity information. That is, the correspondence between accounts and UWB modules is stored in the target server 41: a user can register an account on the target server and bind the device information of one or more UWB modules to it. In this embodiment, the first user terminal 11 is held by spectator A, the second user terminal 21 is held by spectator B, and the third user terminal 31 is held by performer C. The target server 41 stores the correspondence between account A of spectator A and the first target device information, between account B of spectator B and the second target device information, and between account C of performer C and the third target device information. In addition, account C of performer C also stores the third target identity information corresponding to the third target device information, so that both spectator A and spectator B can, by acquiring the third target device information of performer C, request the third target identity information of performer C from the target server 41 and thereby determine the identity of performer C.
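To make the stored correspondence concrete, the following is a minimal Python sketch of such a server-side store, assuming a simple in-memory layout; the class and field names (Account, TargetServerStore, identity_info, tokens) are illustrative assumptions and are not specified by the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Account:
    account_id: str                              # e.g. account C of performer C
    uwb_device_ids: set = field(default_factory=set)  # bound UWB modules
    identity_info: Optional[str] = None          # e.g. third target identity information
    tokens: int = 0                              # optional token balance (see below)

class TargetServerStore:
    def __init__(self) -> None:
        self.accounts = {}           # account_id -> Account
        self.device_to_account = {}  # target device information -> account_id

    def register(self, account_id: str, identity_info: Optional[str] = None) -> Account:
        # A user registers an account on the target server.
        acct = Account(account_id, identity_info=identity_info)
        self.accounts[account_id] = acct
        return acct

    def bind_device(self, account_id: str, device_id: str) -> None:
        # The user binds the device information of one or more UWB modules.
        self.accounts[account_id].uwb_device_ids.add(device_id)
        self.device_to_account[device_id] = account_id

    def identity_for_device(self, device_id: str) -> Optional[str]:
        # Resolve target identity information from target device information.
        account_id = self.device_to_account.get(device_id)
        acct = self.accounts.get(account_id) if account_id else None
        return acct.identity_info if acct else None
```

For example, `store.register("account_C", identity_info="Performer C")` followed by `store.bind_device("account_C", "uwb-dev-3")` would let `identity_for_device("uwb-dev-3")` resolve the third target identity information.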
Referring to fig. 2, the interactive method for the performance site of this embodiment specifically includes the following steps executed by the first processor:
First, step S11 is performed to receive performance location information from the target server. The target server may receive the performance location information, i.e., the time, location, profile and other details of the performance, from the third user terminal of performer C, and push the performance location information to the first user terminal of spectator A.
Then, step S12 is performed to control the first image acquisition module to scan the target scene, and to control the first UWB module to communicate with at least one third UWB module within the target scene.
Then, step S13 is performed to acquire the first relative position of the third UWB module relative to the first UWB module. Based on the principles of UWB communication, the distance and angle between the first UWB module and the third UWB module can be obtained.
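As a minimal sketch, assuming the UWB stack reports a distance and a horizontal angle of arrival in the first terminal's local frame (real values come from vendor-specific UWB APIs, which the patent does not name), the first relative position can be expressed as a planar offset:

```python
import math

def relative_position(distance_m: float, angle_rad: float) -> tuple:
    """Convert UWB distance/angle into a 2D offset (x to the right, y forward)."""
    x = distance_m * math.sin(angle_rad)
    y = distance_m * math.cos(angle_rad)
    return (x, y)
```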
Then, step S14 is executed to control the first AR display module to display a first target image of the target scene, determine the first target position of the third UWB module in the first target image according to the first relative position, and determine the first target object at the first target position. The first target image is the image obtained directly by scanning the target scene with the first image acquisition module. People and objects in the target image can be identified with an existing image recognition algorithm such as YOLOv5, and the distance between each person or object in the target image and the first user terminal can be obtained with an existing camera ranging principle such as binocular ranging. Since the relative position of the third UWB module with respect to the first user terminal is known, the person in the target image who holds the third UWB module can be determined: that person's position in the first target image is the first target position, and that person is the first target object.
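A minimal sketch of this matching step follows, assuming each image detection carries a camera-ranged distance and a bearing derived from its bounding-box center; the Detection fields and the tolerance values are illustrative assumptions, not prescribed by the patent.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "person" from an algorithm such as YOLOv5
    bbox: tuple         # (x, y, w, h) in the target image
    distance_m: float   # from a ranging principle such as binocular ranging
    bearing_rad: float  # horizontal angle inferred from the bbox center

def match_uwb_to_detection(detections, uwb_distance_m, uwb_bearing_rad,
                           tol_m=0.5, tol_rad=0.1):
    """Pick the person detection closest to the UWB-measured range/bearing."""
    best, best_err = None, float("inf")
    for det in detections:
        if det.label != "person":
            continue
        # Normalized error combining range and bearing mismatch.
        err = (abs(det.distance_m - uwb_distance_m) / tol_m
               + abs(det.bearing_rad - uwb_bearing_rad) / tol_rad)
        if err < best_err:
            best, best_err = det, err
    return best  # its bbox gives the first target position in the image
```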
Then, step S15 is performed to acquire third target device information of a third UWB module.
Then, step S16 is executed to send a first identity request instruction to the target server according to the third target device information, and to acquire from the target server the third target identity information corresponding to the third target device information. By requesting from the target server the third target identity information corresponding to the third target device information, the specific identity of the first target object is confirmed.
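The identity request could look like the following minimal sketch, assuming a hypothetical HTTP/JSON interface; the endpoint path and field names are assumptions, since the patent does not specify a wire protocol.

```python
import requests

def request_identity(server_url: str, device_info: str) -> str:
    """Send a first identity request instruction carrying target device information."""
    resp = requests.post(f"{server_url}/identity",
                         json={"device_info": device_info},
                         timeout=5)
    resp.raise_for_status()
    # The server resolves the bound account and returns its identity information.
    return resp.json()["identity_info"]
```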
Then, step S17 is executed to control the first AR display module to display the third target identity information at the first target position of the first target image. Thus, spectator A can intuitively see the third target identity information of the first target object on the first AR display module; the third target identity information may include identity information such as a name.
Then, step S18 is performed to send a first interaction instruction to the target server and receive a second interaction instruction from the target server. That is, spectator A may choose to send a first interaction instruction to the target server and, after receiving the second interaction instruction from the target server, interact with the first target object, i.e., performer C. The first interaction instruction requests the target server to display a first target interaction effect on the first target object, for example requesting that the first target object in the first target image wear a hat; the second interaction instruction may return the data packet corresponding to the hat, or return a reply allowing the display.
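A minimal sketch of step S18 under the same hypothetical HTTP interface; the payload layout is an assumption, and "hat" is the example effect from the text.

```python
import requests

def send_interaction(server_url: str, device_info: str, effect: str = "hat") -> dict:
    """First interaction instruction: ask the server to show `effect` on the
    target object bound to `device_info`."""
    resp = requests.post(f"{server_url}/interact",
                         json={"device_info": device_info, "effect": effect},
                         timeout=5)
    resp.raise_for_status()
    # Second interaction instruction: either the effect's data packet or a
    # reply allowing display, per the description above.
    return resp.json()
```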
Finally, step S19 is executed to control the first AR display module to display, according to the second interaction instruction, the first target interaction effect on the first target object of the first target image. The first target interaction effect may be rendered by the first processor and displayed on the first target object.
The above steps realize the interaction between spectator A and performer C at the performance site, i.e., in the target scene. Meanwhile, spectator B is also in the target scene and watches the performance of performer C from a different position. To realize the interactive method of this embodiment for spectator B as well, referring to fig. 3, the second processor further executes the following steps:
first, step S21 is executed to control the second image acquisition module to scan the target scene, and control the second UWB module to communicate with at least one third UWB module within the target scene.
Step S22 is then performed to obtain a second relative position of the third UWB module with respect to the second UWB module.
Then, step S23 is executed to control the second AR display module to display a second target image of the target scene, determine a second target position of the third UWB module in the second target image according to the second relative position, and determine a second target object at the second target position.
Then, step S24 is executed to acquire third target device information of the third UWB module, and send the third target device information to the target server.
Then, step S25 is executed to send a second identity request instruction to the target server according to the third target device information, and to acquire from the target server the third target identity information corresponding to the third target device information.
Then, step S26 is executed to control the second AR display module to display the third target identity information at the second target position of the second target image.
And finally, executing step S27, receiving a second interaction instruction from the target server, and controlling the second AR display module to display a second target interaction effect on a second target object of the second target image.
Steps S21 to S27 are performed in the same way as steps S11 to S19 and are not described again here. It should be noted that the second target object and the first target object are the same performer C, and the second target interaction effect and the first target interaction effect are the same target interaction effect seen from different viewing angles. Since spectator A and spectator B are at different positions in the target scene and face performer C from different directions, the first target interaction effect rendered by the first processor and the second target interaction effect rendered by the second processor differ in viewing angle. For example, spectator A, in front of performer C, sees on the first AR display module that performer C wears a hat, but sees only the front of the hat; spectator B, to the side of performer C, sees on the second AR display module that performer C wears the hat, but sees only the side of the hat.
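This viewing-angle difference follows naturally if each terminal anchors the same 3D effect at the performer's UWB-derived position and renders it from its own pose. A minimal sketch, assuming a shared 2D ground frame in which each terminal knows its own and the performer's position; the frame and function are illustrative, not part of the patent.

```python
import math

def view_yaw(viewer_xy: tuple, performer_xy: tuple) -> float:
    """Angle from which this viewer sees the effect anchored on the performer.

    Each terminal feeds its own pose into its renderer, which is why
    spectator A sees the front of the hat and spectator B sees its side.
    """
    dx = performer_xy[0] - viewer_xy[0]
    dy = performer_xy[1] - viewer_xy[1]
    return math.atan2(dx, dy)  # 0 when directly in front along +y
```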
In the above interactive method, the target server synchronously sends the second interaction instruction, which is sent to the first user terminal, to the second user terminal as well, so that spectator B holding the second user terminal can see the interaction effect initiated by spectator A toward performer C. Specifically, referring to fig. 4, the target server performs the following steps:
step S31 is first executed, where a first identity request instruction is received from a first user terminal, first target identity information is determined according to the first identity request instruction, and the first target identity information is sent to the first user terminal. In this embodiment, the first target identity information is the same as the third target identity information.
Then, step S32 is executed to receive a second identity request instruction from the second user terminal, determine second target identity information according to the second identity request instruction, and send the second target identity information to the second user terminal. In this embodiment, the second target identity information is the same as the third target identity information.
Finally, step S33 is executed to receive a first interaction instruction from the first user terminal, determine that the first target identity information is the same as the second target identity information, and send the second interaction instruction to the first user terminal and the second user terminal. In this embodiment, the first user terminal and the second user terminal both requested the third target identity information from the target server within a preset time, which indicates that spectator A and spectator B are in the same target scene; the target server therefore sends the second interaction instruction to the first user terminal and the second user terminal respectively, so that spectator B sees the interaction with performer C initiated by spectator A.
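A minimal sketch of this server-side grouping, assuming identity requests are logged with timestamps and that terminals which requested the same target identity within the window are treated as being at the same site; the window length and message shape are illustrative assumptions.

```python
import time

WINDOW_S = 600  # preset time window (illustrative value)

class InteractionBroker:
    def __init__(self) -> None:
        # identity_info -> list of (terminal_id, timestamp)
        self.recent_requests = {}

    def record_identity_request(self, terminal_id: str, identity_info: str) -> None:
        self.recent_requests.setdefault(identity_info, []).append(
            (terminal_id, time.time()))

    def co_located_terminals(self, identity_info: str) -> set:
        now = time.time()
        return {tid for tid, ts in self.recent_requests.get(identity_info, [])
                if now - ts <= WINDOW_S}

    def handle_first_interaction(self, sender_id: str, identity_info: str, send) -> None:
        # Send the second interaction instruction to every terminal that
        # recently requested the same target identity, including the sender;
        # `send` is a callable (terminal_id, message) supplied by the transport.
        for tid in self.co_located_terminals(identity_info):
            send(tid, {"type": "second_interaction", "target": identity_info})
```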
Optionally, the target server further stores the correspondence among accounts, target device information and token balances. The first interaction instruction may consume a preset number of tokens: after the target server receives the first interaction instruction, it updates the target account pointed to by the first interaction instruction, i.e., it updates the token balance of the account corresponding to the first UWB module.
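A minimal sketch of the token update, reusing the Account shape from the earlier server-side sketch; the cost value is an illustrative assumption.

```python
INTERACTION_COST = 10  # preset number of tokens per first interaction (assumed)

def charge_for_interaction(account) -> bool:
    """Deduct tokens from the account behind the first UWB module."""
    if account.tokens < INTERACTION_COST:
        return False  # insufficient tokens: refuse the interaction
    account.tokens -= INTERACTION_COST  # update the pointed-to target account
    return True
```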
In summary, when the invention is used for interaction at a performance site, the first user terminal and the second user terminal can each acquire, through their own UWB modules, the third target device information of the third UWB module, determine the performer from that device information, and then interact through interaction instructions, with the target interaction effect displayed visually through the image acquisition module and the AR display module; this facilitates interaction between spectators and performers at the performance site. Meanwhile, when the target server receives a first interaction instruction from one spectator, it determines the other spectators at the same performance site and sends second interaction instructions to them, so that spectators at the same site can see the interaction effects initiated by other spectators toward the performer, further facilitating interaction between spectators and performers.
Computer apparatus embodiment:
The computer device of this embodiment includes a processor and a memory, where the memory stores a computer program, and the processor implements the above interactive method for a performance site when executing the computer program.
The computer device may include, but is not limited to, a processor and a memory. Those skilled in the art will appreciate that a computer device may include more or fewer components than described, or combine certain components, or use different components; for example, the computer device may also include input and output devices, network access devices, buses, and the like.
For example, the processor may be a central processing unit (Central Processing Unit, CPU), but may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor. The processor is the control center of the computer device and connects the various parts of the entire computer device using various interfaces and lines.
The memory may be used to store the computer program and/or modules, and the processor implements the various functions of the computer device by running or executing the computer program and/or modules stored in the memory and invoking the data stored in the memory. For example, the memory may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and the application programs required for at least one function (e.g., a sound receiving function, a speech-to-text function, etc.), and the data storage area may store data created according to the use of the device (e.g., audio data, text data, etc.). In addition, the memory may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
Computer-readable storage medium embodiments:
If the modules integrated in the computer device of the above embodiment are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a computer readable storage medium. Based on this understanding, all or part of the flow of the above interactive method embodiment may also be accomplished by a computer program instructing the relevant hardware. The computer program may be stored in a computer readable storage medium, and when executed by the processor, the computer program can implement the steps of the above interactive method for a performance site. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The storage medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content of the computer readable medium may be added to or removed from as required by legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, the computer readable medium does not include electrical carrier signals and telecommunication signals.
Claims (8)
1. An interactive method for a performance site, characterized by comprising the following steps:
controlling a first image acquisition module to scan a target scene, and controlling a first UWB module to communicate with at least one third UWB module in the target scene;
acquiring a first relative position of the third UWB module relative to the first UWB module;
controlling a first AR display module to display a first target image of the target scene, determining a first target position of the third UWB module in the first target image according to the first relative position, and determining a first target object at the first target position;
acquiring third target device information of the third UWB module;
sending a first identity request instruction to a target server according to the third target device information, and acquiring third target identity information corresponding to the third target device information from the target server;
controlling the first AR display module to display the third target identity information at the first target position of the first target image;
sending a first interaction instruction to the target server, and receiving a second interaction instruction from the target server;
and controlling the first AR display module to display a first target interaction effect on the first target object of the first target image according to the second interaction instruction.
2. The interactive method for a performance site according to claim 1, characterized in that:
when the first AR display module is controlled to display the first target image of the target scene and the first target position of the third UWB module in the first target image is determined according to the first relative position, the first target object at the first target position is determined according to a preset image recognition algorithm.
3. The interactive method for a performance site according to claim 1, characterized in that:
performance location information is received from the target server before the first image acquisition module is controlled to scan the target scene.
4. An interactive method for a performance site, characterized by comprising the following steps:
receiving a first identity request instruction from a first user terminal, determining first target identity information according to the first identity request instruction, and sending the first target identity information to the first user terminal;
receiving a second identity request instruction from a second user terminal, determining second target identity information according to the second identity request instruction, and sending the second target identity information to the second user terminal;
and receiving a first interaction instruction from the first user terminal, determining that the first target identity information is the same as the second target identity information, and sending a second interaction instruction to the first user terminal and the second user terminal.
5. The interactive method for a performance site according to claim 4, characterized in that:
after the first interaction instruction is received, the target account pointed to by the first interaction instruction is updated.
6. An interactive method for a performance site, characterized by comprising the following steps:
controlling a second image acquisition module to scan a target scene, and controlling a second UWB module to communicate with at least one third UWB module in the target scene;
acquiring a second relative position of the third UWB module relative to the second UWB module;
controlling a second AR display module to display a second target image of the target scene, determining a second target position of the third UWB module in the second target image according to the second relative position, and determining a second target object at the second target position;
acquiring third target device information of the third UWB module, and sending the third target device information to a target server;
sending a second identity request instruction to the target server according to the third target device information, and acquiring third target identity information corresponding to the third target device information from the target server;
controlling the second AR display module to display the third target identity information at the second target position of the second target image;
and receiving a second interaction instruction from the target server, and controlling the second AR display module to display a second target interaction effect on the second target object of the second target image.
7. A computer device, comprising a memory and a processor, the memory storing a computer program, characterized in that:
the processor, when executing the computer program in the memory, implements the interactive method for a performance site according to any one of claims 1 to 6.
8. A computer readable storage medium storing a computer program, characterized in that:
the computer program, when executed by a processor, implements the interactive method for a performance site according to any one of claims 1 to 6.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310482989.3A CN116560504A (en) | 2023-04-28 | 2023-04-28 | Interactive method, computer device and computer readable storage medium for performance site |
PCT/CN2023/121986 WO2024221739A1 (en) | 2023-04-28 | 2023-09-27 | Interaction method for performance site, computer apparatus, and computer-readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310482989.3A CN116560504A (en) | 2023-04-28 | 2023-04-28 | Interactive method, computer device and computer readable storage medium for performance site |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116560504A (en) | 2023-08-08 |
Family
ID=87495825
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310482989.3A Pending CN116560504A (en) | 2023-04-28 | 2023-04-28 | Interactive method, computer device and computer readable storage medium for performance site |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN116560504A (en) |
WO (1) | WO2024221739A1 (en) |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105138135B (en) * | 2015-09-15 | 2018-08-28 | 北京国承万通信息科技有限公司 | Wear-type virtual reality device and virtual reality system |
CN105892650A (en) * | 2016-03-28 | 2016-08-24 | 联想(北京)有限公司 | Information processing method and electronic equipment |
US10319128B2 (en) * | 2016-09-26 | 2019-06-11 | Rockwell Automation Technologies, Inc. | Augmented reality presentation of an industrial environment |
CN107682729A (en) * | 2017-09-08 | 2018-02-09 | 广州华多网络科技有限公司 | It is a kind of based on live interactive approach and live broadcast system, electronic equipment |
CN113709537B (en) * | 2020-05-21 | 2023-06-13 | 云米互联科技(广东)有限公司 | User interaction method based on 5G television, 5G television and readable storage medium |
CN114820992A (en) * | 2021-01-28 | 2022-07-29 | 腾讯科技(深圳)有限公司 | Identity information display method, device, terminal, server and storage medium |
CN116560504A (en) * | 2023-04-28 | 2023-08-08 | 张仲元 | Interactive method, computer device and computer readable storage medium for performance site |
Application timeline:
- 2023-04-28: CN202310482989.3A filed (CN); published as CN116560504A, status pending
- 2023-09-27: PCT/CN2023/121986 filed (WO); published as WO2024221739A1
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024221740A1 (en) * | 2023-04-28 | 2024-10-31 | 张仲元 | Information interaction methods, computer apparatus and computer-readable storage medium |
WO2024221739A1 (en) * | 2023-04-28 | 2024-10-31 | 张仲元 | Interaction method for performance site, computer apparatus, and computer-readable storage medium |
CN117369633A (en) * | 2023-10-07 | 2024-01-09 | 上海铱奇科技有限公司 | AR-based information interaction method and system |
Also Published As
Publication number | Publication date |
---|---|
WO2024221739A1 (en) | 2024-10-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN116560504A (en) | Interactive method, computer device and computer readable storage medium for performance site | |
US20170195650A1 (en) | Method and system for multi point same screen broadcast of video | |
US9226005B2 (en) | Preprocessing video to insert visual elements and applications thereof | |
US20230137219A1 (en) | Image processing system and method in metaverse environment | |
CN111970524B (en) | Control method, device, system, equipment and medium for interactive live broadcast and microphone connection | |
CN109819316B (en) | Method and device for processing face sticker in video, storage medium and electronic equipment | |
CN113986177A (en) | Screen projection method, screen projection device, storage medium and electronic equipment | |
US11290752B2 (en) | Method and apparatus for providing free viewpoint video | |
CN106303663A (en) | Live treating method and apparatus, direct broadcast server | |
CN115086686A (en) | Video processing method and related device | |
CN114095671A (en) | Cloud conference live broadcast system, method, device, equipment and medium | |
US11272224B2 (en) | Information processing device and method | |
US20230132137A1 (en) | Method and apparatus for converting picture into video, and device and storage medium | |
CN103957464A (en) | Advertisement distributing method and system | |
CN114513506A (en) | Service processing method, access edge cloud server and service processing system | |
CN112752085A (en) | Naked eye 3D video playing system and method based on human eye tracking | |
JP2006041811A (en) | Free visual point picture streaming method | |
CN114422816A (en) | Live video processing method and device, electronic equipment and storage medium | |
CN108320331B (en) | Method and equipment for generating augmented reality video information of user scene | |
US10951860B2 (en) | Methods, systems, and apparatus for providing video communications | |
CN114187216B (en) | Image processing method, device, terminal equipment and storage medium | |
CN114915798A (en) | Real-time video generation method, multi-camera live broadcast method and device | |
CN112672089A (en) | Conference control and conferencing method, device, server, terminal and storage medium | |
KR20200080041A (en) | Method and apparatus for generating multi channel images using mobile terminal | |
CN112770074B (en) | Video conference realization method, device, server and computer storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |