
WO2024221739A1 - Interaction method for performance site, computer apparatus, and computer-readable storage medium - Google Patents

Interaction method for performance site, computer apparatus, and computer-readable storage medium

Info

Publication number
WO2024221739A1
WO2024221739A1 (application PCT/CN2023/121986)
Authority
WO
WIPO (PCT)
Prior art keywords
target
module
display
uwb module
interaction
Prior art date
Application number
PCT/CN2023/121986
Other languages
French (fr)
Chinese (zh)
Inventor
张仲元
Original Assignee
张仲元
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 张仲元 filed Critical 张仲元
Publication of WO2024221739A1 publication Critical patent/WO2024221739A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/023 Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00 Reducing energy consumption in communication networks
    • Y02D 30/70 Reducing energy consumption in communication networks in wireless communication networks

Definitions

  • The present invention relates to the field of Internet technology, and more particularly to an interaction method for a performance site, a computer device, and a computer-readable storage medium.
  • This application is based on a Chinese invention application filed on April 28, 2023, with application number CN202310482989.3, the contents of which are incorporated herein by reference.
  • At a performance site, the interaction between the audience and the performers is often limited to language, and the interaction method is relatively simple.
  • For example, at a concert, due to the limitations of the number of people, viewing angles, and distance, it is difficult for the audience to interact with the performers.
  • When a viewer interacts with a performer, other viewers often find it difficult to see the interaction between that viewer and the performer, which affects the audience's on-site interactive experience.
  • The first object of the present invention is to provide an interaction method for a performance site that facilitates interaction between viewers and performers.
  • The second object of the present invention is to provide another interaction method for a performance site that facilitates interaction between viewers and performers.
  • The third object of the present invention is to provide another interaction method for a performance site that facilitates interaction between viewers and performers.
  • The fourth object of the present invention is to provide a computer device for implementing the above-mentioned interaction method for a performance site.
  • The fifth object of the present invention is to provide a computer-readable storage medium for implementing the above-mentioned interaction method for a performance site.
  • The present invention provides an interaction method for a performance site, which includes the following steps: controlling a first image acquisition module to scan a target scene, and controlling a first UWB module to communicate with at least one third UWB module in the target scene; obtaining a first relative position of the third UWB module relative to the first UWB module; controlling a first AR display module to display a first target image of the target scene, determining a first target position of the third UWB module in the first target image according to the first relative position, and determining a first target object at the first target position; obtaining third target device information of the third UWB module; sending a first identity request instruction to a target server according to the third target device information, and obtaining third target identity information corresponding to the third target device information from the target server; controlling the first AR display module to display the third target identity information at the first target position of the first target image; sending a first interaction instruction to the target server, and receiving a second interaction instruction from the target server; and, according to the second interaction instruction, controlling the first AR display module to display a first target interaction effect on the first target object of the first target image.
  • In a further solution, when controlling the AR display module to display the target image of the target scene and determining the target position of the third UWB module in the target image according to the relative position, the target object at the target position is determined according to a preset image recognition algorithm.
  • In a further solution, performance location information is received from the target server before the image acquisition module is controlled to scan the target scene.
  • The present invention provides an interaction method for a performance site, which includes the following steps: receiving a first identity request instruction from a first user terminal, determining first target identity information according to the first identity request instruction, and sending the first target identity information to the first user terminal; receiving a second identity request instruction from a second user terminal, determining second target identity information according to the second identity request instruction, and sending the second target identity information to the second user terminal; receiving a first interaction instruction from the first user terminal, determining that the first target identity information is the same as the second target identity information, and sending a second interaction instruction to the first user terminal and the second user terminal.
  • In a further solution, after the first interaction instruction is received, the target account pointed to by the first interaction instruction is updated.
  • The present invention provides an interaction method for a performance site, which includes the following steps: controlling a second image acquisition module to scan the target scene, and controlling a second UWB module to communicate with at least one third UWB module in the target scene; obtaining a second relative position of the third UWB module relative to the second UWB module; controlling a second AR display module to display a second target image of the target scene, determining a second target position of the third UWB module in the second target image according to the second relative position, and determining a second target object at the second target position; obtaining the third target device information of the third UWB module, and sending the third target device information to the target server; sending a second identity request instruction to the target server according to the third target device information, and obtaining the third target identity information corresponding to the third target device information from the target server; controlling the second AR display module to display the third target identity information at the second target position of the second target image; and receiving the second interaction instruction from the target server, and controlling the second AR display module to display a second target interaction effect on the second target object of the second target image.
  • The present invention provides a computer device, including a memory and a processor, wherein the memory stores a computer program; when the processor executes the computer program in the memory, the above-mentioned interaction method for a performance site is implemented.
  • The present invention provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the above-mentioned interaction method for a performance site is implemented.
  • Through communication between the first UWB module and the third UWB module, the present invention can obtain the third target device information of the third UWB module and then determine the third target identity information corresponding to the third target device information.
  • Combined with the image acquisition module and the first AR display module, after the first interaction instruction is initiated, the first target interaction effect can be intuitively presented on the first target object, which facilitates the interaction between the viewer and the performer and improves the interactive experience.
  • In addition, combined with the image recognition algorithm, the object holding the third UWB module can be determined more accurately.
  • By determining whether the first target identity information and the second target identity information are the same, the present invention confirms that the viewers holding the first user terminal and the second user terminal are in the same target scene and that both obtain a target image including the performer through their image acquisition modules and AR display modules. After receiving the first interaction instruction sent by the first user terminal, the second interaction instruction is sent to both the first user terminal and the second user terminal in the target scene, so that the viewer holding the second user terminal can see the interaction effect initiated by the viewer holding the first user terminal, thereby improving the interactive experience of the viewers in the target scene.
  • Through communication between the second UWB module and the third UWB module, the present invention can obtain the third target device information of the third UWB module and then determine the third target identity information corresponding to the third target device information.
  • Combined with the image acquisition module and the AR display module, according to the received second interaction instruction, the second target interaction effect can be intuitively presented on the second target object, thereby improving the interactive experience.
  • FIG. 1 is an architecture diagram of the interaction system for a performance site in the performance-site embodiment of the present invention.
  • FIG. 2 is a flow chart of the steps executed by the first processor in the performance-site embodiment of the present invention.
  • FIG. 3 is a flow chart of the steps executed by the second processor in the performance-site embodiment of the present invention.
  • FIG. 4 is a flow chart of the steps executed by the target server in the performance-site embodiment of the present invention.
  • The present invention implements an interaction method for a performance site based on AR (Augmented Reality) technology and UWB (Ultra Wide Band) technology to improve the viewing experience of viewers at the performance site.
  • The present invention also provides a computer device and a computer-readable storage medium for implementing the interaction method for a performance site.
  • The interaction method for a performance site of this embodiment is implemented based on an interaction system for the performance site.
  • The interaction system 1 for the performance site includes a first user terminal 11, a second user terminal 21, a third user terminal 31, and a target server 41.
  • The first user terminal 11, the second user terminal 21, and the third user terminal 31 can all communicate wirelessly with the target server 41.
  • The first user terminal 11 includes a first image acquisition module 111, a first AR display module 112, a first UWB module 113, and a first processor 114.
  • The first processor 114 is connected to the first image acquisition module 111, the first AR display module 112, and the first UWB module 113, respectively.
  • The second user terminal 21 includes a second image acquisition module 211, a second AR display module 212, a second UWB module 213, and a second processor 214; the second processor 214 is connected to the second image acquisition module 211, the second AR display module 212, and the second UWB module 213, respectively.
  • The third user terminal 31 includes a third UWB module 313.
  • Both the first UWB module 113 and the second UWB module 213 can perform UWB communication with the third UWB module 313.
  • In this embodiment, the first user terminal 11, the second user terminal 21, and the third user terminal 31 may all be smartphones; in that case, the first image acquisition module 111 and the second image acquisition module 211 may both be camera modules of a smartphone, and the first AR display module 112 and the second AR display module 212 may both be display modules of a smartphone.
  • The first UWB module 113 has a unique device number, namely the first target device information.
  • The second UWB module 213 has a unique device number, namely the second target device information.
  • The third UWB module 313 has a unique device number, namely the third target device information.
  • The target server 41 stores the correspondence between accounts, the target device information of UWB modules, and target identity information. That is, the target server 41 stores the correspondence between accounts and UWB modules, and a user can register an account on the target server and bind the device information of one or more UWB modules to it.
  • In this embodiment, viewer A holds the first user terminal 11, viewer B holds the second user terminal 21, and performer C holds the third user terminal 31.
  • The target server 41 stores the correspondence between viewer A's account A and the first target device information, the correspondence between viewer B's account B and the second target device information, and the correspondence between performer C's account C and the third target device information.
  • Performer C's account C also stores the third target identity information corresponding to the third target device information, so that both viewer A and viewer B can request performer C's third target identity information from the target server 41 by obtaining performer C's third target device information, thereby determining performer C's identity.
  • The interaction method for a performance site of this embodiment specifically includes the following steps executed by the first processor:
  • Step S11 is executed first to receive performance location information from the target server.
  • The performance location information is information such as the time, location, and introduction of the performance.
  • The target server may receive the performance location information from performer C, i.e., from the third user terminal, and push it to viewer A's first user terminal.
  • step S12 is executed to control the first image acquisition module to scan the target scene, and control the first UWB module to communicate with at least one third UWB module in the target scene.
  • step S13 is performed to obtain a first relative position of the third UWB module relative to the first UWB module. Based on the principle of UWB communication, the distance and angle between the first UWB module and the third UWB module can be known.
  • step S14 is executed to control the first AR display module to display the first target image of the target scene, determine the first target position of the third UWB module in the first target image according to the first relative position, and determine the first target object at the first target position.
  • The first target image is the image obtained directly by the first image acquisition module scanning the target scene, and the third UWB module is within that image.
  • Using an existing image recognition algorithm, such as YOLOv5, the people and objects in the target image can be identified.
  • Using an existing camera ranging principle, such as binocular ranging, the distances of the people and objects in the target image relative to the first user terminal can be obtained.
  • Since the relative position of the third UWB module with respect to the first UWB module, i.e., the first relative position with respect to the first user terminal, is known, the person holding the third UWB module in the target image can be determined.
  • The position of the person holding the third UWB module in the first target image is the first target position, and that person is the first target object.
  • step S15 is executed to obtain the third target device information of the third UWB module.
  • Step S16 is executed to send a first identity request instruction to the target server according to the third target device information and to obtain from the target server the third target identity information corresponding to the third target device information; that is, the target server is requested to provide the third target identity information corresponding to the third target device information, so as to confirm the specific identity of the first target object.
  • Step S17 is executed to control the first AR display module to display the third target identity information at the first target position of the first target image.
  • Viewer A can thus directly see the third target identity information of the first target object on the first AR display module, and the third target identity information may include identity information such as a name.
  • Step S18 is executed to send a first interaction instruction to the target server and receive a second interaction instruction from the target server. That is, viewer A can choose to send a first interaction instruction to the target server and, after receiving the second interaction instruction from the target server, interact with the first target object, that is, with performer C.
  • The first interaction instruction is an instruction requesting the target server to display a first target interaction effect on the first target object, for example, requesting that a hat be displayed on the first target object in the first target image.
  • The second interaction instruction may be a data packet carrying the corresponding hat effect or a reply permitting the display.
  • step S19 is performed to control the first AR display module to display the first target interaction effect on the first target object of the first target image according to the second interaction instruction.
  • the first target interaction effect can be rendered by the first processor and then displayed on the first target object.
  • the second processor further performs the following steps:
  • step S21 is executed to control the second image acquisition module to scan the target scene, and control the second UWB module to communicate with at least one third UWB module in the target scene.
  • step S22 is performed to obtain a second relative position of the third UWB module relative to the second UWB module.
  • step S23 is executed to control the second AR display module to display the second target image of the target scene, determine the second target position of the third UWB module in the second target image according to the second relative position, and determine the second target object at the second target position.
  • step S24 is executed to obtain the third target device information of the third UWB module, and send the third target device information to the target server.
  • Step S25 is executed to send a second identity request instruction to the target server according to the third target device information, and to obtain from the target server the third target identity information corresponding to the third target device information.
  • step S26 is executed to control the second AR display module to display the third target identity information at the second target position of the second target image.
  • step S27 is executed to receive a second interaction instruction from the target server, and control the second AR display module to display a second target interaction effect on the second target object in the second target image.
  • the second target object and the first target object are the same performer C, and the second target interaction effect and the first target interaction effect are the same target interaction effects under different perspectives. Since viewers A and B are at different positions in the target scene and face performer C in different directions, there is a difference in perspective between the first target interaction effect rendered by the first processor and the second target interaction effect rendered by the second processor. For example, if viewer A is in front of performer C, viewer A can see performer C wearing a hat from the first AR display module, but only see the front of the hat. If viewer B is on the side of performer C, viewer B can see performer C wearing a hat from the second AR display module, but only see the side of the hat.
  • the target server can synchronously send the second interactive instruction sent to the first user terminal to the second user terminal, so that the viewer B holding the second user terminal can see the interactive effect initiated by the viewer A to the performer C.
  • the target server performs the following steps:
  • step S31 is performed to receive a first identity request instruction from a first user terminal, determine first target identity information according to the first identity request instruction, and send the first target identity information to the first user terminal.
  • the first target identity information is the same as the third target identity information.
  • step S32 is performed to receive a second identity request instruction from the second user terminal, determine the second target identity information according to the second identity request instruction, and send the second target identity information to the second user terminal.
  • the second target identity information is the same as the third target identity information.
  • In step S33, the target server receives the first interaction instruction from the first user terminal, determines that the first target identity information is the same as the second target identity information, and sends the second interaction instruction to the first user terminal and the second user terminal. That is, after receiving the first interaction instruction, the target server determines which user terminals requested the identity information of the account corresponding to the third target identity information within a preset time period.
  • Since the first user terminal and the second user terminal both requested the target server to determine the third target identity information within the preset time period, viewer A and viewer B are in the same target scene, and the target server therefore sends the second interaction instruction to the first user terminal and the second user terminal respectively, so that viewer B can see the interaction effect initiated by viewer A with performer C.
  • The target server also stores the correspondence between accounts, target device information, and the number of tokens held.
  • The first interaction instruction may correspond to a preset number of tokens.
  • After receiving the first interaction instruction, the target server also updates the target account pointed to by the first interaction instruction, that is, it updates the number of tokens of the account corresponding to the first UWB module.
  • the present invention is used for interaction at a performance site.
  • the first user terminal and the second user terminal can obtain the third target device information of the third UWB module through their respective UWB modules, thereby determining the performer through the third target device information, and then interacting through interactive instructions.
  • the image acquisition module and the AR display module realize the intuitive display of the target interactive effect of the interactive instruction, which facilitates the interaction between the audience and the performer at the performance site.
  • When the target server receives a first interaction instruction from a viewer, it determines the other viewers at the performance site where that viewer is located and sends a second interaction instruction to those viewers, so that viewers at the same performance site can see the interaction effects initiated by other viewers toward the performer, further facilitating the interaction between the audience and the performer at the performance site.
  • the computer device of this embodiment includes a processor and a memory.
  • the memory stores a computer program.
  • When the processor executes the computer program, the above-mentioned interaction method for a performance site is implemented.
  • the computer device may include but is not limited to a processor and a memory. Those skilled in the art will appreciate that the computer device may include more or fewer components, or a combination of certain components, or different components, for example, the computer device may also include input and output devices, network access devices, buses, etc.
  • the processor may be a central processing unit (CPU), other general-purpose processors, digital signal processors (DSP), application-specific integrated circuits (ASIC), field programmable gate arrays (FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.
  • a general-purpose processor may be a microcontroller or any conventional processor, etc.
  • the processor is the control center of a computer device, and uses various interfaces and lines to connect various parts of the entire computer device.
  • the memory can be used to store computer programs and/or modules.
  • The processor realizes various functions of the computer device by running or executing the computer programs and/or modules stored in the memory and calling the data stored in the memory.
  • the memory can mainly include a program storage area and a data storage area, wherein the program storage area can store an operating system, an application required for at least one function (such as a sound receiving function, a sound conversion to text function, etc.), etc.; the data storage area can store data created according to the use of the mobile phone (such as audio data, text data, etc.), etc.
  • The memory can include a high-speed random access memory, and can also include a non-volatile memory, such as a hard disk, an internal memory, a plug-in hard disk, a smart memory card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, a flash card (Flash Card), at least one disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • If the module integrated in the computer device of the above embodiment is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • All or part of the process of the above interaction method embodiment for a performance site can also be completed by instructing the relevant hardware through a computer program.
  • The computer program can be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the interaction method for a performance site can be implemented.
  • The computer program includes computer program code, and the computer program code can be in source code form, object code form, an executable file, or some intermediate form.
  • The storage medium may include: any entity or device that can carry computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electric carrier signal, a telecommunication signal, and a software distribution medium.
  • the content contained in the computer-readable medium can be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction. For example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electric carrier signals and telecommunication signals.
  • the interactive method, computer device, and computer-readable storage medium at a performance site of the present invention are applicable to the field of Internet technology.
  • the third target device information of the performer is obtained through the UWB module of the viewer's user terminal, and the performer is determined from the target server through the third target device information, so that the viewer interacts with the performer through interactive instructions.
  • the interactive effect is synchronized to other viewers at the same performance site, thereby improving the interactive experience between the viewers and performers at the performance site.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Information Transfer Between Computers (AREA)
  • Telephonic Communication Services (AREA)
  • Electrophonic Musical Instruments (AREA)

Abstract

The present invention provides an interaction method for a performance site, a computer apparatus, and a computer-readable storage medium. The interaction method for a performance site comprises: controlling a first image acquisition module to scan a target scene and controlling a first UWB module to communicate with a third UWB module; acquiring a first relative location of the third UWB module relative to the first UWB module; controlling a first AR display module to display a first target image of the target scene, determining a first target location of the third UWB module, and determining a first target object; acquiring third target device information of the third UWB module; according to the third target device information, sending a first identity request instruction to a target server, and acquiring third target identity information from the target server; controlling the first AR display module to display the third target identity information; sending a first interaction instruction to the target server; and according to a second interaction instruction, controlling the first AR display module to display a first target interaction effect. The present invention can facilitate interaction between a performer and a viewer.

Description

Interaction method for a performance site, computer device, and computer-readable storage medium

Technical Field

The present invention relates to the field of Internet technology, and more particularly to an interaction method for a performance site, a computer device, and a computer-readable storage medium. This application is based on a Chinese invention application filed on April 28, 2023, with application number CN202310482989.3, the contents of which are incorporated herein by reference.

Background Art

At a performance site, the interaction between the audience and the performers is often limited to language, and the interaction method is relatively simple. For example, at a concert, due to the limitations of the number of people, viewing angles, and distance, it is difficult for the audience to interact with the performers. When a viewer interacts with a performer, other viewers often find it difficult to see the interaction between that viewer and the performer, which affects the audience's on-site interactive experience.

Technical Problem

The first object of the present invention is to provide an interaction method for a performance site that facilitates interaction between viewers and performers.

The second object of the present invention is to provide another interaction method for a performance site that facilitates interaction between viewers and performers.

The third object of the present invention is to provide another interaction method for a performance site that facilitates interaction between viewers and performers.

The fourth object of the present invention is to provide a computer device for implementing the above-mentioned interaction method for a performance site.

The fifth object of the present invention is to provide a computer-readable storage medium for implementing the above-mentioned interaction method for a performance site.

Technical Solution

In order to achieve the above first object, the present invention provides an interaction method for a performance site, which includes the following steps: controlling a first image acquisition module to scan a target scene, and controlling a first UWB module to communicate with at least one third UWB module in the target scene; obtaining a first relative position of the third UWB module relative to the first UWB module; controlling a first AR display module to display a first target image of the target scene, determining a first target position of the third UWB module in the first target image according to the first relative position, and determining a first target object at the first target position; obtaining third target device information of the third UWB module; sending a first identity request instruction to a target server according to the third target device information, and obtaining third target identity information corresponding to the third target device information from the target server; controlling the first AR display module to display the third target identity information at the first target position of the first target image; sending a first interaction instruction to the target server, and receiving a second interaction instruction from the target server; and, according to the second interaction instruction, controlling the first AR display module to display a first target interaction effect on the first target object of the first target image.

In a further solution, when controlling the AR display module to display the target image of the target scene and determining the target position of the third UWB module in the target image according to the relative position, the target object at the target position is determined according to a preset image recognition algorithm.

In a further solution, performance location information is received from the target server before the image acquisition module is controlled to scan the target scene.

In order to achieve the above second object, the present invention provides an interaction method for a performance site, which includes the following steps: receiving a first identity request instruction from a first user terminal, determining first target identity information according to the first identity request instruction, and sending the first target identity information to the first user terminal; receiving a second identity request instruction from a second user terminal, determining second target identity information according to the second identity request instruction, and sending the second target identity information to the second user terminal; receiving a first interaction instruction from the first user terminal, determining that the first target identity information is the same as the second target identity information, and sending a second interaction instruction to the first user terminal and the second user terminal.

In a further solution, after the first interaction instruction is received, the target account pointed to by the first interaction instruction is updated.

In order to achieve the above third object, the present invention provides an interaction method for a performance site, which includes the following steps: controlling a second image acquisition module to scan the target scene, and controlling a second UWB module to communicate with at least one third UWB module in the target scene; obtaining a second relative position of the third UWB module relative to the second UWB module; controlling a second AR display module to display a second target image of the target scene, determining a second target position of the third UWB module in the second target image according to the second relative position, and determining a second target object at the second target position; obtaining the third target device information of the third UWB module, and sending the third target device information to the target server; sending a second identity request instruction to the target server according to the third target device information, and obtaining the third target identity information corresponding to the third target device information from the target server; controlling the second AR display module to display the third target identity information at the second target position of the second target image; and receiving the second interaction instruction from the target server, and controlling the second AR display module to display a second target interaction effect on the second target object of the second target image.

In order to achieve the above fourth object, the present invention provides a computer device, including a memory and a processor, wherein the memory stores a computer program; when the processor executes the computer program in the memory, the above-mentioned interaction method for a performance site is implemented.

In order to achieve the above fifth object, the present invention provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the above-mentioned interaction method for a performance site is implemented.

Beneficial Effects

Through communication between the first UWB module and the third UWB module, the present invention can obtain the third target device information of the third UWB module and then determine the third target identity information corresponding to the third target device information. Combined with the image acquisition module and the first AR display module, after the first interaction instruction is initiated, the first target interaction effect can be intuitively presented on the first target object, which facilitates the interaction between the viewer and the performer and improves the interactive experience.

In addition, combined with the image recognition algorithm, the object holding the third UWB module can be determined more accurately.

Moreover, receiving the performance location information makes it convenient for viewers to travel to the performance site.

By determining whether the first target identity information and the second target identity information are the same, the present invention confirms that the viewers holding the first user terminal and the second user terminal are in the same target scene and that both obtain a target image including the performer through their image acquisition modules and AR display modules. After receiving the first interaction instruction sent by the first user terminal, the second interaction instruction is sent to both the first user terminal and the second user terminal in the target scene, so that the viewer holding the second user terminal can see the interaction effect initiated by the viewer holding the first user terminal, thereby improving the interactive experience of the viewers in the target scene.

Through communication between the second UWB module and the third UWB module, the present invention can obtain the third target device information of the third UWB module and then determine the third target identity information corresponding to the third target device information. Combined with the image acquisition module and the AR display module, according to the received second interaction instruction, the second target interaction effect can be intuitively presented on the second target object, thereby improving the interactive experience.

Brief Description of the Drawings

FIG. 1 is an architecture diagram of the interaction system for a performance site in the performance-site embodiment of the present invention.

FIG. 2 is a flow chart of the steps executed by the first processor in the performance-site embodiment of the present invention.

FIG. 3 is a flow chart of the steps executed by the second processor in the performance-site embodiment of the present invention.

FIG. 4 is a flow chart of the steps executed by the target server in the performance-site embodiment of the present invention.

The present invention is further described below in conjunction with the accompanying drawings and embodiments.

Best Mode for Carrying Out the Invention

The present invention implements an interaction method for a performance site based on AR (Augmented Reality) technology and UWB (Ultra Wide Band) technology to improve the viewing experience of viewers at the performance site. The present invention also provides a computer device and a computer-readable storage medium for implementing the interaction method for the performance site.

Embodiment of the interaction method for a performance site:

Referring to FIG. 1, the interaction method for a performance site of this embodiment is implemented based on an interaction system for the performance site. The interaction system 1 for the performance site includes a first user terminal 11, a second user terminal 21, a third user terminal 31, and a target server 41; the first user terminal 11, the second user terminal 21, and the third user terminal 31 can all communicate wirelessly with the target server 41.

The first user terminal 11 includes a first image acquisition module 111, a first AR display module 112, a first UWB module 113, and a first processor 114. The first processor 114 is connected to the first image acquisition module 111, the first AR display module 112, and the first UWB module 113, respectively.

The second user terminal 21 includes a second image acquisition module 211, a second AR display module 212, a second UWB module 213, and a second processor 214; the second processor 214 is connected to the second image acquisition module 211, the second AR display module 212, and the second UWB module 213, respectively.

The third user terminal 31 includes a third UWB module 313.

Both the first UWB module 113 and the second UWB module 213 can perform UWB communication with the third UWB module 313.

In this embodiment, the first user terminal 11, the second user terminal 21, and the third user terminal 31 may all be smartphones; in that case, the first image acquisition module 111 and the second image acquisition module 211 may both be camera modules of a smartphone, and the first AR display module 112 and the second AR display module 212 may both be display modules of a smartphone.

The first UWB module 113 has a unique device number, namely the first target device information. The second UWB module 213 has a unique device number, namely the second target device information. The third UWB module 313 has a unique device number, namely the third target device information.

The target server 41 stores the correspondence between accounts, the target device information of UWB modules, and target identity information. That is, the target server 41 stores the correspondence between accounts and UWB modules, and a user can register an account on the target server and bind the device information of one or more UWB modules to it. In this embodiment, viewer A holds the first user terminal 11, viewer B holds the second user terminal 21, and performer C holds the third user terminal 31. The target server 41 stores the correspondence between viewer A's account A and the first target device information, the correspondence between viewer B's account B and the second target device information, and the correspondence between performer C's account C and the third target device information. In addition, performer C's account C also stores the third target identity information corresponding to the third target device information, so that both viewer A and viewer B can request performer C's third target identity information from the target server 41 by obtaining performer C's third target device information, thereby determining performer C's identity.
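For illustration only, the sketch below shows one way the registry described above (accounts bound to UWB device numbers, with optional identity information and a token balance) could be represented on the target server; the class names, field names, and example device numbers are assumptions and are not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Account:
    account_id: str
    device_ids: set = field(default_factory=set)   # bound UWB device numbers (target device information)
    identity_info: Optional[dict] = None            # e.g. {"name": "Performer C"}
    tokens: int = 0                                  # used if interactions consume tokens

class TargetServerRegistry:
    """Hypothetical in-memory stand-in for the target server's stored correspondences."""

    def __init__(self):
        self.accounts = {}            # account_id -> Account
        self.device_to_account = {}   # UWB device number -> account_id

    def register(self, account):
        self.accounts[account.account_id] = account
        for device_id in account.device_ids:
            self.device_to_account[device_id] = account.account_id

    def identity_for_device(self, device_id):
        """Resolve target identity information from target device information."""
        account_id = self.device_to_account.get(device_id)
        if account_id is None:
            return None
        return self.accounts[account_id].identity_info

# Example: performer C's account binds the third UWB module's device number.
registry = TargetServerRegistry()
registry.register(Account("account_C", {"uwb-device-3"}, {"name": "Performer C"}))
registry.register(Account("account_A", {"uwb-device-1"}))
print(registry.identity_for_device("uwb-device-3"))   # -> {'name': 'Performer C'}
print(registry.identity_for_device("uwb-device-1"))   # -> None (viewer accounts need no identity info)
```

A lookup keyed by device number is what allows an identity request carrying the third target device information to be answered directly.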

Referring to FIG. 2, the interaction method for a performance site of this embodiment specifically includes the following steps executed by the first processor:

First, step S11 is executed to receive performance location information from the target server. The performance location information is information such as the time, location, and introduction of the performance. The target server may receive the performance location information from performer C, i.e., from the third user terminal, and push it to viewer A's first user terminal.

Then step S12 is executed to control the first image acquisition module to scan the target scene and to control the first UWB module to communicate with at least one third UWB module in the target scene.

Then step S13 is executed to obtain a first relative position of the third UWB module relative to the first UWB module. Based on the principles of UWB communication, the distance and angle between the first UWB module and the third UWB module can be determined.
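As an illustrative sketch of step S13 (not a prescribed implementation), a UWB range and angle-of-arrival measurement can be converted into a planar offset of the third UWB module relative to the first user terminal; the coordinate convention and the example numbers are assumptions.

```python
import math

def relative_position(distance_m, angle_deg):
    """Convert a UWB range and angle-of-arrival into a planar offset (x, y)
    of the third UWB module relative to the first user terminal, where the
    angle is measured from the terminal's facing direction."""
    angle = math.radians(angle_deg)
    return (distance_m * math.sin(angle),   # lateral offset, metres
            distance_m * math.cos(angle))   # forward offset, metres

# e.g. a performer measured 12 m away, 15 degrees to the right of the camera axis
print(relative_position(12.0, 15.0))        # -> approximately (3.11, 11.59)
```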

Then step S14 is executed to control the first AR display module to display the first target image of the target scene, determine the first target position of the third UWB module in the first target image according to the first relative position, and determine the first target object at the first target position. The first target image is the image obtained directly by the first image acquisition module scanning the target scene, and the third UWB module is within that image. Using an existing image recognition algorithm, such as YOLOv5, the people and objects in the target image can be identified, and using an existing camera ranging principle, such as binocular ranging, the distances of the people and objects in the target image relative to the first user terminal can be obtained. Since the relative position of the third UWB module with respect to the first UWB module, i.e., the first relative position with respect to the first user terminal, is known, the person holding the third UWB module in the target image can be determined. The position of that person in the first target image is the first target position, and that person is the first target object.
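The following is a hedged sketch of the matching idea in step S14: detected people, each with a camera-estimated distance and bearing, are compared against the UWB-measured first relative position, and the closest match is taken as the first target object. The detection structure, the weighting of angular error, and the example values are assumptions made purely for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Detection:
    box: tuple            # pixel bounding box (x1, y1, x2, y2) from the recognition model
    distance_m: float     # estimated by camera ranging, e.g. binocular ranging
    bearing_deg: float    # horizontal angle from the camera axis

def match_uwb_to_detection(uwb_distance_m, uwb_angle_deg, detections):
    """Pick the detected person whose camera-estimated distance and bearing best
    agree with the UWB-measured first relative position."""
    def cost(d):
        # weight angular error so that roughly 1 degree counts like 0.2 m of range error
        return math.hypot(d.distance_m - uwb_distance_m,
                          0.2 * (d.bearing_deg - uwb_angle_deg))
    return min(detections, key=cost, default=None)

people = [Detection((100, 80, 180, 400), 11.5, 14.0),
          Detection((420, 90, 500, 380), 11.8, -20.0)]
target = match_uwb_to_detection(12.0, 15.0, people)
print(target.box)   # the first detection wins; its box marks the first target position
```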

Then step S15 is executed to obtain the third target device information of the third UWB module.

Then step S16 is executed to send a first identity request instruction to the target server according to the third target device information and to obtain from the target server the third target identity information corresponding to the third target device information; that is, the target server is requested to provide the third target identity information corresponding to the third target device information, so as to confirm the specific identity of the first target object.
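A minimal sketch of what the first identity request instruction in step S16 might look like as a client call; the endpoint path and field names are hypothetical, since the patent does not define a wire format.

```python
import json
import urllib.request

def request_identity(server_url, requester_account, target_device_info):
    """Send a first identity request instruction carrying the third target device
    information and return the third target identity information from the server."""
    payload = json.dumps({
        "type": "identity_request",
        "account": requester_account,              # viewer A's account
        "target_device_info": target_device_info,  # read from the third UWB module
    }).encode("utf-8")
    request = urllib.request.Request(server_url + "/identity", data=payload,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as response:
        return json.load(response)                 # e.g. {"name": "Performer C"}

# identity = request_identity("https://target-server.example", "account_A", "uwb-device-3")
```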

Then step S17 is executed to control the first AR display module to display the third target identity information at the first target position of the first target image. Viewer A can thus directly see the third target identity information of the first target object on the first AR display module, and the third target identity information may include identity information such as a name.
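For step S17, one simple way to present the identity label at the first target position is to draw it over the detected bounding box in the displayed frame; the sketch below uses OpenCV drawing calls purely for illustration and is not the patent's display pipeline.

```python
import cv2  # OpenCV is used here only as a convenient way to draw the overlay

def draw_identity_label(frame, box, identity_info):
    """Draw the third target identity information at the first target position."""
    x1, y1, x2, y2 = box
    label = identity_info.get("name", "unknown")
    cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)       # outline the target
    cv2.putText(frame, label, (x1, max(0, y1 - 10)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)     # name above the box
    return frame
```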

Then step S18 is executed to send a first interaction instruction to the target server and receive a second interaction instruction from the target server. That is, viewer A can choose to send a first interaction instruction to the target server and, after receiving the second interaction instruction from the target server, interact with the first target object, that is, with performer C. The first interaction instruction is an instruction requesting the target server to display a first target interaction effect on the first target object, for example, requesting that a hat be displayed on the first target object in the first target image. The second interaction instruction may be a data packet carrying the corresponding hat effect or a reply permitting the display.
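A sketch of what the two interaction instructions in step S18 might carry, consistent with the hat example above; all field names and the asset URL are hypothetical placeholders.

```python
# Hypothetical payloads for the interaction round-trip described above.
first_interaction_instruction = {
    "type": "interaction_request",
    "from_account": "account_A",              # viewer A initiates the interaction
    "target_device_info": "uwb-device-3",     # identifies performer C's third UWB module
    "effect": "hat",                          # requested first target interaction effect
}

# The second interaction instruction from the target server: either a data packet
# carrying the effect to render or simply a reply permitting the display.
second_interaction_instruction = {
    "type": "interaction_grant",
    "effect": "hat",
    "asset_url": "https://target-server.example/assets/hat.glb",   # placeholder asset location
}
```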

最后执行步骤S19,根据第二交互指令,控制第一AR显示模块在第一目标图像的第一目标对象显示第一目标交互效果。第一目标交互效果可以是由第一处理器渲染,进而在第一目标对象上进行显示。Finally, step S19 is performed to control the first AR display module to display the first target interaction effect on the first target object of the first target image according to the second interaction instruction. The first target interaction effect can be rendered by the first processor and then displayed on the first target object.

由上述步骤可以实现观赏者A与表演者C在演出现场,即目标场景的交互,同时,目标场景内还存在观赏者B,观赏者B与观赏者A在不同的位置观看表演者C的演出,实现本实施例的演出现场的交互方法,参照图3,第二处理器还执行以下步骤:The above steps can realize the interaction between viewer A and performer C at the performance scene, that is, the target scene. At the same time, there is also viewer B in the target scene. Viewer B and viewer A watch the performance of performer C at different positions, realizing the interactive method of the performance scene of this embodiment. Referring to FIG. 3, the second processor further performs the following steps:

首先执行步骤S21,控制第二图像采集模块扫描目标场景,控制第二UWB模块与目标场景内的至少一个第三UWB模块通信。First, step S21 is executed to control the second image acquisition module to scan the target scene, and control the second UWB module to communicate with at least one third UWB module in the target scene.

然后执行步骤S22,获取第三UWB模块相对于第二UWB模块的第二相对位置。Then, step S22 is performed to obtain a second relative position of the third UWB module relative to the second UWB module.

然后执行步骤S23,控制第二AR显示模块显示目标场景的第二目标图像,根据第二相对位置确定第二目标图像中第三UWB模块的第二目标位置,确定第二目标位置的第二目标对象。Then, step S23 is executed to control the second AR display module to display the second target image of the target scene, determine the second target position of the third UWB module in the second target image according to the second relative position, and determine the second target object at the second target position.

然后执行步骤S24,获取第三UWB模块的第三目标设备信息,发送第三目标设备信息至目标服务器。Then, step S24 is executed to obtain the third target device information of the third UWB module, and send the third target device information to the target server.

Then step S25 is executed to send a second identity request instruction to the target server according to the third target device information, and to obtain the third target identity information corresponding to the third target device information from the target server.

Then step S26 is executed to control the second AR display module to display the third target identity information at the second target position of the second target image.

Finally, step S27 is executed to receive the second interaction instruction from the target server and control the second AR display module to display a second target interaction effect on the second target object in the second target image.

The specific execution of steps S21 to S27 follows steps S11 to S19 described above and is not repeated here. It should be noted that the second target object and the first target object are the same performer C, and the second target interaction effect and the first target interaction effect are the same target interaction effect seen from different viewing angles. Because viewer A and viewer B occupy different positions in the target scene and face performer C from different directions, the first target interaction effect rendered by the first processor and the second target interaction effect rendered by the second processor differ in perspective. For example, if viewer A stands in front of performer C, viewer A sees performer C wearing a hat in the first AR display module but sees only the front of the hat; if viewer B stands to the side of performer C, viewer B sees performer C wearing the same hat in the second AR display module but sees only the side of the hat.
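One way to realize this view dependence, sketched here under the assumption that each terminal knows its own pose and the performer's anchor position in a shared world frame (for example derived from the UWB measurements), is to render the same effect with a per-viewer model-view transform. The shared world frame and the pose source are assumptions of this sketch, not details given in the disclosure.

```python
import numpy as np

def hat_model_view(anchor_world, viewer_world, yaw_rad):
    """Build a per-viewer model-view matrix for the shared 'hat' effect.

    The hat is anchored at the performer's world position; each terminal renders it
    with its own viewing transform, so viewer A sees the front of the hat while
    viewer B, standing to the side, sees its side.
    """
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    rotation = np.array([[c, 0.0, s],
                         [0.0, 1.0, 0.0],
                         [-s, 0.0, c]])                 # yaw of the viewer only
    offset = np.asarray(anchor_world, float) - np.asarray(viewer_world, float)
    mv = np.eye(4)
    mv[:3, :3] = rotation
    mv[:3, 3] = rotation @ offset
    return mv  # handed to that terminal's AR renderer
```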

In the above interaction method for the performance site, the target server may send the second interaction instruction that it sent to the first user terminal to the second user terminal as well, so that viewer B, who holds the second user terminal, can see the interaction effect that viewer A initiated toward performer C. Specifically, referring to FIG. 4, the target server performs the following steps:

First, step S31 is executed to receive a first identity request instruction from the first user terminal, determine first target identity information according to the first identity request instruction, and send the first target identity information to the first user terminal. In this embodiment, the first target identity information is the same as the third target identity information.

Then step S32 is executed to receive a second identity request instruction from the second user terminal, determine second target identity information according to the second identity request instruction, and send the second target identity information to the second user terminal. In this embodiment, the second target identity information is also the same as the third target identity information.

Finally, step S33 is executed to receive the first interaction instruction from the first user terminal, determine that the first target identity information is the same as the second target identity information, and send the second interaction instruction to the first user terminal and the second user terminal. That is, after receiving the first interaction instruction, the target server determines which accounts requested the identity corresponding to the third target identity information within a preset time period. In this embodiment, both the first user terminal and the second user terminal requested the third target identity information from the target server within the preset period, which indicates that viewer A and viewer B are in the same target scene; the target server therefore sends the second interaction instruction to both the first user terminal and the second user terminal, so that viewer B sees the interaction effect initiated by viewer A toward performer C.
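A minimal server-side sketch of this matching, assuming an in-memory record of recent identity requests keyed by performer; the window length, data structures, and identifiers are invented for illustration.

```python
import time
from collections import defaultdict

PRESET_WINDOW_S = 60.0                        # assumed length of the preset period
recent_requests = defaultdict(dict)           # performer_id -> {terminal_id: timestamp}

def note_identity_request(performer_id, terminal_id):
    """Record that a terminal asked for this performer's identity (steps S31/S32)."""
    recent_requests[performer_id][terminal_id] = time.time()

def terminals_to_notify(performer_id):
    """Return every terminal that requested this performer within the preset window.

    In step S33 the target server sends the second interaction instruction to all of
    them, so viewer B also sees the effect that viewer A initiated.
    """
    now = time.time()
    return [t for t, ts in recent_requests[performer_id].items()
            if now - ts <= PRESET_WINDOW_S]
```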

Optionally, the target server also stores the correspondence among accounts, target devices, and token balances. The first interaction instruction may correspond to a preset number of tokens; after receiving the first interaction instruction, the target server also updates the target account to which the first interaction instruction points, that is, it updates the token balance of the account corresponding to the first UWB module.
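As a sketch of this optional accounting step, assuming tokens are debited from the viewer's account (the disclosure only says that the target account is updated, so the direction of the update is an assumption here):

```python
token_balances = {"account-A": 100}            # assumed in-memory store, for illustration
TOKENS_PER_INTERACTION = 5                     # the preset number of tokens

def apply_first_interaction(account_id: str) -> bool:
    """Update the account tied to the first UWB module when its viewer interacts."""
    balance = token_balances.get(account_id, 0)
    if balance < TOKENS_PER_INTERACTION:
        return False                           # not enough tokens; reject the instruction
    token_balances[account_id] = balance - TOKENS_PER_INTERACTION
    return True
```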

In summary, the present invention is used for interaction at a performance site. Both the first user terminal and the second user terminal can obtain the third target device information of the third UWB module through their respective UWB modules, identify the performer through the third target device information, and then interact with the performer through interaction instructions; the image acquisition modules and the AR display modules provide an intuitive display of the target interaction effect of each interaction instruction, which facilitates interaction between the audience and the performer at the performance site. In addition, when the target server receives a first interaction instruction from one viewer, it determines the other viewers at the same performance site and sends them the second interaction instruction, so that viewers at the same performance site can see the interaction effects that other viewers initiate toward the performer, further facilitating interaction between the audience and the performer.

Computer device embodiment:

The computer device of this embodiment includes a processor and a memory. The memory stores a computer program, and the processor implements the above interaction method for a performance site when executing the computer program.

The computer device may include, but is not limited to, a processor and a memory. Those skilled in the art will appreciate that the computer device may include more or fewer components, combine certain components, or use different components; for example, it may also include input and output devices, network access devices, buses, and so on.

For example, the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and so on. A general-purpose processor may be a microcontroller, or the processor may be any conventional processor. The processor is the control center of the computer device and connects the various parts of the whole computer device through various interfaces and lines.

The memory may be used to store computer programs and/or modules. The controller implements the various functions of the computer device by running or executing the computer programs and/or modules stored in the memory and by calling the data stored in the memory. For example, the memory may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and the applications required by at least one function (such as a sound-receiving function or a speech-to-text function), and the data storage area may store data created through use of the device (such as audio data and text data). In addition, the memory may include a high-speed random access memory and may also include a non-volatile memory, such as a hard disk, internal memory, a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, a flash card, at least one magnetic disk storage device, a flash memory device, or another solid-state storage device.

Computer-readable storage medium embodiment:

If the modules integrated in the computer device of the above embodiment are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the above embodiment of the interaction method for the performance site may also be completed by instructing the relevant hardware through a computer program. The computer program may be stored in a computer-readable storage medium, and when executed by the controller, the computer program implements the steps of the above interaction method for the performance site. The computer program includes computer program code, which may be in source code form, object code form, an executable file, or some intermediate form. The storage medium may include any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and so on. It should be noted that the content contained in a computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunication signals.

Industrial Applicability

The interaction method for a performance site, the computer device, and the computer-readable storage medium of the present invention are applicable to the field of Internet technology. The third target device information of the performer is obtained through the UWB module of the viewer's user terminal, and the performer is identified via the target server using the third target device information, so that the viewer can interact with the performer through interaction instructions. When one viewer interacts with the performer, the interaction effect is synchronized to the other viewers at the same performance site, improving the interaction experience between viewers and performers at the performance site.

Claims (8)

1. An interaction method for a performance site, characterized in that it comprises the following steps:
controlling a first image acquisition module to scan a target scene, and controlling a first UWB module to communicate with at least one third UWB module in the target scene;
obtaining a first relative position of the third UWB module relative to the first UWB module;
controlling a first AR display module to display a first target image of the target scene, determining a first target position of the third UWB module in the first target image according to the first relative position, and determining a first target object at the first target position;
obtaining third target device information of the third UWB module;
sending a first identity request instruction to a target server according to the third target device information, and obtaining third target identity information corresponding to the third target device information from the target server;
controlling the first AR display module to display the third target identity information at the first target position of the first target image;
sending a first interaction instruction to the target server, and receiving a second interaction instruction from the target server;
controlling, according to the second interaction instruction, the first AR display module to display a first target interaction effect on the first target object in the first target image.
2. The interaction method for a performance site according to claim 1, characterized in that:
when the first AR display module is controlled to display the first target image of the target scene and the first target position of the third UWB module in the first target image is determined according to the first relative position, the first target object at the first target position is determined according to a preset image recognition algorithm.
3. The interaction method for a performance site according to claim 1, characterized in that:
before the first image acquisition module is controlled to scan the target scene, performance location information is received from the target server.
4. An interaction method for a performance site, characterized in that it comprises the following steps:
receiving a first identity request instruction from a first user terminal, determining first target identity information according to the first identity request instruction, and sending the first target identity information to the first user terminal;
receiving a second identity request instruction from a second user terminal, determining second target identity information according to the second identity request instruction, and sending the second target identity information to the second user terminal;
receiving a first interaction instruction from the first user terminal, determining that the first target identity information is the same as the second target identity information, and sending a second interaction instruction to the first user terminal and the second user terminal.
5. The interaction method for a performance site according to claim 4, characterized in that:
after the first interaction instruction is received, a target account to which the first interaction instruction points is updated.
6. An interaction method for a performance site, characterized in that it comprises the following steps:
controlling a second image acquisition module to scan a target scene, and controlling a second UWB module to communicate with at least one third UWB module in the target scene;
obtaining a second relative position of the third UWB module relative to the second UWB module;
controlling a second AR display module to display a second target image of the target scene, determining a second target position of the third UWB module in the second target image according to the second relative position, and determining a second target object at the second target position;
obtaining third target device information of the third UWB module, and sending the third target device information to a target server;
sending a second identity request instruction to the target server according to the third target device information, and obtaining third target identity information corresponding to the third target device information from the target server;
controlling the second AR display module to display the third target identity information at the second target position of the second target image;
receiving a second interaction instruction from the target server, and controlling the second AR display module to display a second target interaction effect on the second target object in the second target image.
7. A computer device, comprising a memory and a processor, the memory storing a computer program, characterized in that:
when the processor executes the computer program in the memory, the interaction method for a performance site according to any one of claims 1 to 6 is implemented.
8. A computer-readable storage medium storing a computer program, characterized in that:
when the computer program is executed by a processor, the interaction method for a performance site according to any one of claims 1 to 6 is implemented.
PCT/CN2023/121986 2023-04-28 2023-09-27 Interaction method for performance site, computer apparatus, and computer-readable storage medium WO2024221739A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202310482989.3A CN116560504A (en) 2023-04-28 2023-04-28 Interactive method, computer device and computer readable storage medium for performance site
CN202310482989.3 2023-04-28

Publications (1)

Publication Number Publication Date
WO2024221739A1 true WO2024221739A1 (en) 2024-10-31

Family

ID=87495825

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/121986 WO2024221739A1 (en) 2023-04-28 2023-09-27 Interaction method for performance site, computer apparatus, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN116560504A (en)
WO (1) WO2024221739A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117075720A (en) * 2023-04-28 2023-11-17 张仲元 Information interaction method, computer device and computer readable storage medium
CN116560504A (en) * 2023-04-28 2023-08-08 张仲元 Interactive method, computer device and computer readable storage medium for performance site
CN117369633B (en) * 2023-10-07 2024-08-23 九转棱镜(北京)科技有限公司 AR-based information interaction method and system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105138135A (en) * 2015-09-15 2015-12-09 北京国承万通信息科技有限公司 Head-mounted type virtual reality device and virtual reality system
CN105892650A (en) * 2016-03-28 2016-08-24 联想(北京)有限公司 Information processing method and electronic equipment
CN107682729A (en) * 2017-09-08 2018-02-09 广州华多网络科技有限公司 It is a kind of based on live interactive approach and live broadcast system, electronic equipment
US20180089870A1 (en) * 2016-09-26 2018-03-29 Rockwell Automation Technologies, Inc. Augmented reality presentation of an industrial environment
CN113709537A (en) * 2020-05-21 2021-11-26 云米互联科技(广东)有限公司 User interaction method based on 5G television, 5G television and readable storage medium
CN114820992A (en) * 2021-01-28 2022-07-29 腾讯科技(深圳)有限公司 Identity information display method, device, terminal, server and storage medium
CN116560504A (en) * 2023-04-28 2023-08-08 张仲元 Interactive method, computer device and computer readable storage medium for performance site

Also Published As

Publication number Publication date
CN116560504A (en) 2023-08-08

Similar Documents

Publication Publication Date Title
WO2024221739A1 (en) Interaction method for performance site, computer apparatus, and computer-readable storage medium
US11895426B2 (en) Method and apparatus for capturing video, electronic device and computer-readable storage medium
US20170195650A1 (en) Method and system for multi point same screen broadcast of video
US12088857B2 (en) Method and system for controlling interactive live streaming co-hosting, device, and medium
CN109586929B (en) Conference content transmission method and device, electronic equipment and storage medium
CN112073754B (en) Cloud game screen projection method and device, computer equipment, computer readable storage medium and cloud game screen projection interaction system
JP6564884B2 (en) Multimedia information reproducing method and system, standardized server and live streaming terminal
CN112312226A (en) Wheat connecting method, system, device, electronic equipment and storage medium
CN113986177A (en) Screen projection method, screen projection device, storage medium and electronic equipment
US20230297324A1 (en) Audio Control Method, System, and Electronic Device
CN114095671A (en) Cloud conference live broadcast system, method, device, equipment and medium
CN114125358A (en) Cloud conference subtitle display method, system, device, electronic equipment and storage medium
WO2023143217A1 (en) Special effect prop display method, apparatus, device, and storage medium
CN110928509B (en) Display control method, display control device, storage medium, and communication terminal
US11683442B2 (en) Methods, systems and apparatus for providing video communications
US20220394325A1 (en) Lyric video display method and device, electronic apparatus and computer-readable medium
US20220174098A1 (en) Methods and apparatus for performing virtual relocation during a network conference
CN115396684B (en) Wheat connecting display method and device, electronic equipment and computer readable medium
CN115086729B (en) Wheat connecting display method and device, electronic equipment and computer readable medium
WO2023109671A1 (en) Live broadcast information processing method and apparatus, and device and storage medium
JP7245350B2 (en) Anchor sharing method and device, system, electronic device and storage medium
CN108092966A (en) Project content transmission method, device, readable storage medium storing program for executing and projector equipment
CN115426514A (en) Cross-device audio and video synchronization method, device, equipment and medium
CN113709652B (en) Audio play control method and electronic equipment
WO2025026354A1 (en) Audio processing method and apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23934908

Country of ref document: EP

Kind code of ref document: A1