CN111628925A - Song interaction method and device, terminal and storage medium - Google Patents
Song interaction method and device, terminal and storage medium
- Publication number
- CN111628925A (application CN202010450193.6A)
- Authority
- CN
- China
- Prior art keywords
- interactive
- interaction
- target
- song
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04L51/046—User-to-user messaging in packet-switching networks; real-time or near real-time messaging, e.g. instant messaging [IM]; interoperability with other network applications or services
- H04L12/185—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast, with management of multicast group membership
- H04L51/10—User-to-user messaging characterised by the inclusion of specific contents; multimedia information
- H04L51/18—User-to-user messaging characterised by the inclusion of specific contents; commands or executable codes
- H04L51/52—User-to-user messaging in packet-switching networks for supporting social networking services
Abstract
The application discloses a song interaction method, device, terminal, and storage medium, belonging to the technical field of networks. The method comprises the following steps: displaying a song playing interface, where the song playing interface includes at least one candidate interaction entry and different candidate interaction entries correspond to different interaction groups; receiving a trigger operation on a target interaction entry among the at least one candidate interaction entry, where the target interaction entry corresponds to a target interaction group; and displaying an interactive interface corresponding to the target interaction group, where the interactive interface is used to display real-time interaction information sent by interaction objects in the target interaction group. With the song interaction method provided by the embodiments of this application, the terminal user receives, through the interactive interface, real-time interaction information sent by interacting users and exchanges synchronized information. This solves the problem in the related art that terminal users cannot exchange information synchronously while a song is playing, and improves the interactivity between the terminal user and the interaction objects while listening to songs.
Description
Technical Field
The embodiments of this application relate to the technical field of networks, and in particular to a song interaction method and device, a terminal, and a storage medium.
Background
With the rapid development of internet technology, online interaction modes are increasing. For example, a user can send bullet-screen comments for interaction while watching a video, or enter the comment area to leave messages while a song is playing.
For song comments, when a target song is played on a terminal, a comment entry for posting comments on the target song is displayed. In the related art, comments on the target song follow an asynchronous interaction mode, and real-time communication among multiple users cannot be achieved. Real-time interaction matters especially to users who are currently playing a song: good real-time interaction can create an ideal social environment for users and promote interaction among them.
Disclosure of Invention
The embodiments of this application provide a song interaction method, device, terminal, and storage medium. The technical solution is as follows:
In one aspect, a song interaction method is provided, the method comprising:
displaying a song playing interface, where the song playing interface includes at least one candidate interaction entry and different candidate interaction entries correspond to different interaction groups;
receiving a trigger operation on a target interaction entry among the at least one candidate interaction entry, where the target interaction entry corresponds to a target interaction group; and
displaying an interactive interface corresponding to the target interaction group, where the interactive interface is used to display real-time interaction information sent by interaction objects in the target interaction group.
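The three claimed steps above can be sketched with plain data structures. This is a minimal illustrative sketch, not the patent's prescribed implementation; all names (`SongPlayInterface`, `InteractionGroup`, and so on) are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class InteractionGroup:
    group_id: str
    name: str
    messages: list = field(default_factory=list)  # real-time interaction info

@dataclass
class SongPlayInterface:
    song: str
    # Step 1: the playing interface carries >= 1 candidate interaction
    # entries, each mapped to a distinct interaction group.
    entries: dict = field(default_factory=dict)  # entry_id -> InteractionGroup

    def trigger_entry(self, entry_id: str) -> InteractionGroup:
        # Step 2: a trigger operation on one target entry selects its group.
        return self.entries[entry_id]

def show_interactive_interface(group: InteractionGroup) -> list:
    # Step 3: the interactive interface displays the group's real-time
    # interaction information.
    return list(group.messages)

ui = SongPlayInterface(song="Song 1", entries={
    "room1": InteractionGroup("g1", "Zz exclusive chat room 1"),
})
ui.entries["room1"].messages.append("hello from another listener")
target = ui.trigger_entry("room1")
visible = show_interactive_interface(target)
```

The entry-to-group mapping is a dictionary here; the patent only requires that different candidate entries correspond to different groups.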
In another aspect, a song interaction apparatus is provided, the apparatus comprising:
an interface display module, configured to display a song playing interface, where the song playing interface includes at least one candidate interaction entry and different candidate interaction entries correspond to different interaction groups;
a trigger operation receiving module, configured to receive a trigger operation on a target interaction entry among the at least one candidate interaction entry, where the target interaction entry corresponds to a target interaction group; and
an interactive interface display module, configured to display an interactive interface corresponding to the target interaction group, where the interactive interface is used to display real-time interaction information sent by interaction objects in the target interaction group.
In another aspect, a terminal is provided that includes a processor and a memory; the memory stores at least one instruction for execution by the processor to implement the song interaction method of the above aspect.
In another aspect, a computer-readable storage medium is provided that stores at least one instruction for execution by a processor to implement a song interaction method as described in the above aspect.
In another aspect, a computer program product is provided that stores at least one instruction, the at least one instruction being loaded and executed by a processor to implement the song interaction method of the above aspect.
In the embodiments of this application, a song interaction method is provided. Unlike the song playing interface in the related art, the song playing interface in the embodiments of this application includes at least one candidate interaction entry. When a trigger operation on a target interaction entry among the at least one candidate interaction entry is received, the terminal interface displays an interaction interface corresponding to the target interaction group; the terminal user then receives, through the interaction interface, real-time interaction information sent by interacting users and exchanges synchronized information. This solves the problem in the related art that terminal users cannot exchange information synchronously while playing songs, and improves the interactivity between the terminal user and the corresponding interaction objects while listening to songs.
Drawings
FIG. 1 illustrates a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 2 illustrates a flow chart of a song interaction method provided by an exemplary embodiment of the present application;
FIG. 3 illustrates an interface diagram of entering an interactive interface provided by an exemplary embodiment of the present application;
FIG. 4 illustrates a flow chart of a song interaction method provided by another exemplary embodiment of the present application;
FIG. 5 is an interface diagram illustrating an interaction group creation process provided by an exemplary embodiment of the present application;
FIG. 6 illustrates a flow chart of a song interaction method provided by another exemplary embodiment of the present application;
FIG. 7 illustrates an interactive interface diagram provided by an exemplary embodiment of the present application;
FIG. 8 is a block diagram illustrating the structure of a song interaction device provided by an exemplary embodiment of the present application;
fig. 9 is a block diagram illustrating a structure of a terminal according to an exemplary embodiment of the present application;
fig. 10 is a block diagram illustrating a server according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of this application clearer, the embodiments of this application are described in further detail below with reference to the accompanying drawings.
Reference herein to "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships are possible; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates an "or" relationship between the preceding and following associated objects.
FIG. 1 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application. Referring to fig. 1, the implementation environment may include: a first terminal 100, a server 200 and a second terminal 300.
It should be noted that, in this embodiment, the first terminal 100 corresponds to the terminal user, the second terminal 300 corresponds to a terminal device of an interaction object of that terminal user, and there is at least one second terminal 300. Clients of the same music playing software are installed and running on both the first terminal 100 and the second terminal 300.
The first terminal 100 is connected to the server 200 through a wireless network or a wired network.
The server 200 serves as a relay for the interaction information transmitted between the terminal user and the interaction objects: it receives real-time interaction information from the first terminal 100 and forwards it to the second terminal 300, so that the user of the second terminal 300 sees, on the interactive interface, the real-time interaction information sent by the user of the first terminal 100.
Optionally, the server 200 may be an independent server, or may be integrated into any one of a server cluster, virtual cloud storage, or a cloud computing center. Taking the server 200 as a server cluster as an example, in the embodiments of this application the server 200 may include an interaction server 210 and a comment server 220, connected to each other through a wireless or wired network. On the side of the first terminal 100, the first terminal 100 is connected to the interaction server 210 and the comment server 220 through a wireless or wired network, respectively; likewise, on the side of the second terminal 300, the second terminal 300 is connected to the interaction server 210 and the comment server 220 through a wireless or wired network, respectively.
In connection with the above components of the server 200, in the embodiments of this application the interaction server 210 may include the following functions: when receiving an interactive group creation operation, the first terminal 100 sends an interactive group creation request to the interaction server 210, which creates a corresponding interaction entry for the interaction group; when the first terminal 100 receives an interaction information sending instruction, it sends the input target real-time interaction information to the interaction server 210, which forwards that information to the other interaction objects in the target interaction group (i.e., the terminal users of the second terminals 300); and when the first terminal 100 receives a closing operation on the interactive interface, it sends an interaction quit instruction to the interaction server 210, which removes the current interaction object from the target interaction group accordingly.
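The interaction server behaviors just described (creating a group, forwarding a message to the other members, and removing a member on exit) can be sketched as follows. This is a hedged illustration with hypothetical names; the patent does not fix an API.

```python
class InteractionServer:
    """Minimal in-memory model of the interaction server 210."""

    def __init__(self):
        self.groups = {}  # group_id -> set of member ids

    def create_group(self, group_id: str, creator: str) -> str:
        # Interactive-group creation request: create the group's entry
        # and add the creator as its first member.
        self.groups[group_id] = {creator}
        return group_id

    def join(self, group_id: str, member: str) -> None:
        self.groups[group_id].add(member)

    def forward(self, group_id: str, sender: str, message: str) -> dict:
        # Forward target real-time interaction information to the OTHER
        # interaction objects in the target interaction group.
        return {m: message for m in self.groups[group_id] if m != sender}

    def quit(self, group_id: str, member: str) -> None:
        # Interaction quit instruction: remove the member from the group.
        self.groups[group_id].discard(member)

srv = InteractionServer()
srv.create_group("room1", "userA")
srv.join("room1", "userB")
srv.join("room1", "userC")
delivered = srv.forward("room1", "userA", "hi all")
srv.quit("room1", "userC")
```

A real deployment would push `delivered` over persistent connections; the dictionary return stands in for that delivery step.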
Further, in the embodiment of the present application, the comment server 220 may include the following functions: when the first terminal 100 receives a comment publishing operation on the target real-time interaction information, the target real-time interaction information is sent to the comment server 220, and the comment server 220 is used for publishing the target real-time interaction information to a comment area corresponding to the target song.
Optionally, the server 200 may further include a user identifier management server, configured to manage events such as user identifier creation and user identifier supervision during information interaction; optionally, the server 200 may further include an interactive interface management server, configured to manage events such as interactive interface identifier creation and interactive interface identifier supervision. The servers listed here are only illustrative examples and may be supplemented according to the content of the embodiments of this application; this is not limited in the embodiments of this application.
In addition, when the server 200 is an independent server, the server 200 itself provides the background service functions of each of the above servers.
In the related art, when a terminal user listens to a song, a song playing interface is displayed on the terminal's application interface, and the song playing interface provides a comment entry through which published comments can be viewed. In one example, user A is playing song 1; user A enters the comment area through a trigger operation on the comment entry, and comments previously published by other terminal users are displayed there. Further, user A may publish an independent comment in the comment area, or reply to comments published by other terminal users.
In the above example, that is, in the comment interaction method provided in the related art, terminal users listening to the same song can only interact asynchronously. If the terminal user is a comment publisher, comment interaction happens only after other terminal users re-enter the comment area and browse and reply to that comment, which takes a long time; if the terminal user is a comment replier, the user likewise waits a long time for the commented user's response, and so on for each round of interaction.
The embodiments of this application provide a song interaction method that solves the problem in the related art that different terminal users cannot interact instantly when commenting on songs.
Referring to fig. 2, a flowchart of a song interaction method provided by an exemplary embodiment of the present application is shown. The method is suitable for the implementation environment shown in FIG. 1, and comprises the following steps:
Step 201, displaying a song playing interface, where the song playing interface includes at least one candidate interaction entry. In the above example, the song playing interface in the related art includes a comment entry; it should be noted that the comment entry is not the same as the candidate interaction entry in the embodiments of this application.
In the embodiments of this application, different candidate interaction entries correspond to different interaction groups. Optionally, if the song playing interface includes one candidate interaction entry, the interaction group corresponding to that entry may be the interaction group uniquely corresponding to the target song, in which every terminal user is listening to the target song at the same time. If the song playing interface includes at least two candidate interaction entries, the interaction groups corresponding to the candidate interaction entries are interaction groups related to the target song; for example, each interaction group is associated with several songs, and the songs associated with each such group include the target song displayed on the song playing interface.
Step 202, receiving a trigger operation on a target interaction entry among the at least one candidate interaction entry. In response to the song playing interface including at least two candidate interaction entries, the user may choose among the current candidate interaction entries; the client receives the terminal user's trigger operation on a target interaction entry, completing the user's selection of the target interaction entry. The target interaction entry corresponds to the target interaction group.
In one example, as shown in FIG. 3, the terminal interface 300 displays a song playing interface together with an entry control 301. When the terminal user triggers the entry control 301, there are two possible interface display scenarios. If triggering the entry control 301 directly opens the unique interactive interface corresponding to the target song, there is only one current candidate interaction entry, and the interactive interface shown on terminal interface 320 is displayed; this interface shows a chat area named "Zz exclusive chat room 1". If, after the entry control 301 is triggered, the interface shown on terminal interface 310 is displayed instead, there is more than one current candidate interaction entry and the terminal user needs to select further. As shown in FIG. 3, terminal interface 310 displays three candidate interaction entries, icons 311 to 313; when the terminal user selects icon 311, the interactive interface shown on terminal interface 320 is opened. That is, the terminal user can select a candidate interaction entry as the target interaction entry according to personal preference and interaction needs, and the client then receives the terminal user's trigger operation on the target interaction entry among the three candidate interaction entries.
Step 203, displaying an interactive interface corresponding to the target interaction group, where the interactive interface is used to display real-time interaction information sent by interaction objects in the target interaction group.
Correspondingly, in response to the client receiving the triggering operation of the terminal user on the target interaction entry in the at least one candidate interaction entry, the terminal interface corresponding to the client displays the interaction interface corresponding to the target interaction group.
The interactive interface is used to display real-time interaction information sent by the interaction objects in the target interaction group. An interaction object is another terminal user distinct from the current terminal user; since the terminal user and the interaction objects are online simultaneously, the terminal user can synchronously receive, on the interactive interface, real-time interaction information sent by the interaction objects. Further, the terminal user may send real-time interaction information on the interactive interface, such as shared links, text, pictures, or voice; the specific form of the real-time interaction information is not limited in the embodiments of this application.
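A real-time interaction message carrying one of the content forms mentioned above (shared link, text, picture, or voice) might look like the sketch below. The schema is an assumption for illustration; the patent deliberately leaves the form open.

```python
from dataclasses import dataclass

# Content forms named in the text; the set is illustrative, not exhaustive.
ALLOWED_KINDS = {"link", "text", "picture", "voice"}

@dataclass
class RealTimeMessage:
    sender: str
    kind: str      # one of ALLOWED_KINDS
    payload: str   # URL, text body, or media reference

    def __post_init__(self):
        # Reject forms outside the set this sketch models.
        if self.kind not in ALLOWED_KINDS:
            raise ValueError(f"unsupported message kind: {self.kind}")

msg = RealTimeMessage(sender="userA", kind="text", payload="great chorus!")
```

The interaction server would forward such a message unchanged; only the client rendering differs per `kind`.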
Schematically, as shown in FIG. 3, the interactive interface displayed on terminal interface 320 is the interactive interface corresponding to the target interaction group, whose members are the interaction objects of the terminal user. In FIG. 3, the terminal user may also view all interaction objects by triggering the interaction object viewing control 321, and perform other functions such as adding friends and sending private messages to interaction objects.
In summary, the embodiments of this application provide a song interaction method. Unlike the song playing interface in the related art, the song playing interface in the embodiments of this application includes at least one candidate interaction entry. When a trigger operation on a target interaction entry among the at least one candidate interaction entry is received, the terminal interface displays an interaction interface corresponding to the target interaction group; the terminal user then receives, through the interaction interface, real-time interaction information sent by interacting users and exchanges synchronized information. This solves the problem in the related art that terminal users cannot exchange information synchronously while playing songs, and improves the interactivity between the terminal user and the corresponding interaction objects while listening to songs.
It has been proposed in the above embodiments that a candidate interaction entry may be an already-existing interaction entry for the played song, or an interaction entry created by the terminal user. The following embodiment explains the case where the terminal user creates an interaction entry.
Referring to fig. 4, a flowchart of a song interaction method provided by another exemplary embodiment of the present application is shown. The method is suitable for the implementation environment shown in FIG. 1, and comprises the following steps:
Step 401, sending an interactive group creation request to the interaction server in response to receiving an interactive group creation operation. In a possible implementation, when the terminal user wants to create an interaction group corresponding to the target song, this can be done by triggering the relevant control on the client interface. If the user clicks a control used to indicate creation of an interaction group, an interactive group creation operation is generated; after receiving the operation, the client sends an interactive group creation request to the interaction server.
The interactive group creation request includes the group name and the interactive songs corresponding to the interaction group, and the interaction server is used to create a corresponding interaction entry for the interaction group. Optionally, the interactive songs may be a set of songs with similar characteristics, such as the same lyricist or the same music style.
As shown in FIG. 5, an interactive interface is displayed on terminal interface 500, showing a chat area named "Zz exclusive chat room 1" and a hall jump control 501. When the terminal user triggers the hall jump control 501, the current interactive interface jumps to the hall interface displayed on terminal interface 510; optionally, the hall interface provides a "recommended room" function and allows choosing a chat room by type under "recommended category". Further, a room creation control 511 is displayed on terminal interface 510. When the terminal user triggers the room creation control 511, an interactive group creation operation is generated, and in response to receiving it the terminal sends an interactive group creation request to the interaction server; the current interface then jumps to the room creation interface displayed on terminal interface 520. Optionally, the room creation interface displays a "room name" field, a "room classification" field, and an "associated song" field; when the terminal user completes the input according to the prompts of the fields, the confirm creation control 521 can be triggered to confirm.
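The interactive group creation request described above, carrying the fields shown on the room creation interface (room name, room classification, associated songs), might be serialized as in this sketch. The field names and JSON wire format are assumptions; the patent does not specify a payload.

```python
import json

def build_group_creation_request(room_name: str,
                                 room_class: str,
                                 associated_songs: list) -> str:
    # The client serializes the room-creation form fields; the interaction
    # server would use them to create the group and its interaction entry.
    return json.dumps({
        "room_name": room_name,
        "room_class": room_class,
        "associated_songs": associated_songs,
    })

req = build_group_creation_request(
    "Zz exclusive chat room 1", "pop", ["Song 1", "Song 2"])
```

On the server side, this payload would feed something like the `create_group` call sketched earlier, plus registration of the entry against each associated song.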
Step 402, in one possible implementation, in response to a song playing request for the target song, the client sends an interaction entry acquisition request to the interaction server. The song playing request is triggered by the terminal user.
In one example, the song interaction method provided by this application is implemented as a new function of song playing software. When the terminal user triggers the song playing control on the song playing interface of the target song, a song playing request is generated and the client sends an interaction entry acquisition request to the interaction server; when the interaction server agrees, the terminal user sees at least one candidate interaction entry displayed on the song playing interface while the target song plays.
Optionally, step 402 includes the following content 1 and content 2.
Content 1: in response to the song playing request, acquiring song information of the target song, where the song information includes at least one of the song name, singer, song style, album, and year.
Content 2: sending an interaction entry acquisition request to the interaction server according to the song information.
The interactive songs corresponding to the candidate interaction entries satisfy one of the following: they are the target song; they correspond to the same singer as the target song; they belong to the same song style as the target song; they belong to the same album as the target song; or they belong to the same year as the target song.
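The matching conditions above can be sketched as a predicate over the song information fields (name, singer, style, album, year). The field names and dictionary representation are assumptions for illustration.

```python
def matches(entry_song: dict, target: dict) -> bool:
    # An entry qualifies if its associated song is the target song itself,
    # or shares the singer, style, album, or year with the target song.
    if entry_song["name"] == target["name"]:
        return True
    return any(entry_song[k] == target[k]
               for k in ("singer", "style", "album", "year"))

target = {"name": "Song 1", "singer": "Zz", "style": "pop",
          "album": "A1", "year": 2020}
same_singer = {"name": "Song 2", "singer": "Zz", "style": "rock",
               "album": "A2", "year": 2018}
unrelated = {"name": "Song 3", "singer": "Qq", "style": "jazz",
             "album": "A3", "year": 1999}
```

The interaction server would apply such a predicate to its registered interaction groups to build the candidate interaction entry information returned in step 403.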
Step 403, displaying at least one candidate interaction entry on the song playing interface according to the candidate interaction entry information sent by the interaction server.
Step 404, receiving a trigger operation on a target interaction entry among the at least one candidate interaction entry. Please refer to step 202; details are not repeated here.
Step 405, displaying an interactive interface corresponding to the target interaction group, where the interactive interface is used to display real-time interaction information sent by interaction objects in the target interaction group.
Please refer to step 203, which is not described herein again in this embodiment.
On the basis of the above embodiment, this embodiment further discloses that the client can send the content of the interactive group creation request to the interaction server, improving the terminal user's participation in interaction groups and enriching the ways in which interaction groups are created. In addition, it discloses the function of automatically displaying candidate interaction entries on the song playing interface when the target song is played, which improves ease of use for the user.
Referring to fig. 6, a flowchart of a song interaction method provided by another exemplary embodiment of the present application is shown. The method is suitable for the implementation environment shown in FIG. 1, and comprises the following steps:
Step 601: refer to step 401; details are not repeated here.
Step 602: refer to step 402; details are not repeated here.
Step 603: refer to step 403; details are not repeated here.
Step 604: refer to step 404; details are not repeated here.
Step 605, in response to the triggering operation on the target interaction entry, reporting first position information to the interaction server. In the embodiment of the application, geographical position interaction between the terminal user and the interactive user can be realized through this step. The first position information is the position information of the current geographical position of the terminal.
And step 606, displaying an interactive interface corresponding to the target interactive group, wherein the interactive interface is used for displaying real-time interactive information sent by the interactive objects in the target interactive group.
Please refer to step 405, which is not described herein again in this embodiment.
Step 607, receiving the real-time interaction information sent by the target interaction object and the second location information of the target interaction object.
To realize the geographical position interaction function between the terminal user and the interactive user, the client, having reported the first position information, receives not only the real-time interaction information sent by the target interactive object but also the second position information of the target interactive object.
The real-time interaction information and the second position information are forwarded by the interaction server. The target interactive object is any terminal user among the interactive objects in the target interaction group.
And step 608, displaying the real-time interaction information, the geographic position and the distance information on the interaction interface.
Optionally, the geographic position is determined according to the second position information, and the distance information is determined according to the first position information and the second position information; the distance information may be computed by the interaction server, or by the client on the basis of the first position information.
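The distance information derived from the first and second position information can be computed with the standard haversine great-circle formula. This is an illustrative sketch under the assumption that both positions are latitude/longitude pairs; the patent itself does not specify a distance formula.

```python
import math

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

Either the interaction server or the client could run this calculation; the result would then be rounded for display (e.g. "2.3 kilometers").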
In one example, as shown in fig. 7, an interactive interface such as the one in terminal interface 700 is displayed. The interactive interface shows a chat area named "Zz-specific chat room 1". When the target interactive object sends real-time interaction information, it can be displayed in the format shown in example block 701: the avatar icon, nickname, gender, geographic position and distance information of the target interactive object, together with the user's speech content (that is, the real-time interaction information sent by the target interactive object), are displayed on the interactive interface. For example, the text prompt "first primary school, distance 2.3 kilometers" is displayed around the avatar icon of the target interactive object, where "first primary school" is the geographic position of the target interactive object and "2.3 kilometers" is the distance information.
It should be noted that the steps 601 to 608 can be implemented as one embodiment.
Optionally, step 609 is further included after step 606.
Step 609, in response to receiving an interaction information sending instruction, sending the input target real-time interaction information to the interaction server. The target real-time interaction information comprises at least one of text information, picture information and audio/video information, and the interaction server is used for forwarding the target real-time interaction information to the other interactive objects in the target interaction group.
In one example, as shown in fig. 5, in the interactive interface corresponding to terminal interface 500, the terminal user inputs text information through dialog input box 502 and triggers send button 503. The client thereby receives an interaction information sending instruction and sends the input target real-time interaction information to the interaction server, which forwards the text information to the other interactive objects in the target interaction group.
In another example, as shown in fig. 5, the terminal user triggers interaction selection control 504 in the interactive interface corresponding to terminal interface 500 to display terminal interface 530, which includes interaction option selection box 531. If the terminal user selects the voice option, an interaction information sending instruction is generated and received by the client. The interaction server then generates a call request according to the interaction information sending instruction and sends it to the other interactive objects in the target interaction group. When one of those interactive objects confirms the call request, the interaction server establishes an instant communication channel between the terminal user and that interactive object, realizing an online call function.
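The forwarding behaviour described in these examples can be sketched as an in-memory broadcast that skips the sender. This is a minimal sketch; the class and method names are assumptions and the real interaction server would of course work over the network.

```python
# Hypothetical sketch of the interaction server forwarding target real-time
# interaction information to the other interactive objects in the group.
class InteractiveObject:
    def __init__(self, name: str):
        self.name = name
        self.inbox = []  # messages forwarded to this object by the server

class InteractionServer:
    def __init__(self):
        self.groups = {}  # group id -> list of interactive objects

    def join(self, group_id: str, obj: InteractiveObject) -> None:
        self.groups.setdefault(group_id, []).append(obj)

    def forward(self, group_id: str, sender: InteractiveObject, message: str) -> None:
        # deliver to every interactive object in the group except the sender
        for obj in self.groups.get(group_id, []):
            if obj is not sender:
                obj.inbox.append((sender.name, message))
```

A voice option would follow the same routing, except that a call request and confirmation round-trip precede establishing the communication channel.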
Optionally, if the target real-time interaction information is displayed in the interaction interface, step 610 is further included after step 606.
Step 610, in response to receiving the comment publishing operation on the target real-time interaction information, sending the target real-time interaction information to a comment server, where the comment server is used for publishing the target real-time interaction information to a comment area corresponding to the target song.
In one example, the terminal user publishes his or her own speech (that is, the target real-time interaction information) as a comment to the comment area corresponding to the target song. The realization process is as follows: the terminal user determines the target real-time interaction information and sends it to the comment server by triggering a publishing control, and the comment server then publishes the target real-time interaction information to the comment area corresponding to the target song.
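The comment server's publish step can be sketched as appending the interaction information to a per-song comment area. This is an illustrative sketch; the class name, method name and return value are assumptions.

```python
# Hypothetical sketch of the comment server publishing target real-time
# interaction information to the comment area of the target song.
class CommentServer:
    def __init__(self):
        self.comment_areas = {}  # song id -> list of published comments

    def publish(self, song_id: str, text: str) -> int:
        """Append the comment and return its position in the comment area."""
        self.comment_areas.setdefault(song_id, []).append(text)
        return len(self.comment_areas[song_id])
```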
Optionally, after step 606, step 611 is further included.
Step 611, in response to receiving the closing operation of the interactive interface, sending an interactive quit instruction to the interactive server.
At this time, the interaction server is used for removing the current interaction object from the target interaction group according to the interaction quit instruction.
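The removal step on the interaction server side can be sketched as follows; the function name and the group mapping are assumptions for illustration.

```python
# Hypothetical sketch of the interaction server handling an interaction
# quit instruction: remove the current interactive object from the group.
def handle_quit(groups: dict, group_id: str, user_id: str) -> dict:
    """Remove user_id from the target interaction group, if present."""
    members = groups.get(group_id, set())
    members.discard(user_id)  # no error if the user already left
    return groups
```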
Optionally, after step 606, step 612 is further included.
Step 612, a group reservation operation for the target interaction group is received.
In a possible implementation manner, after listening to the target song the terminal user cannot stay on the interactive interface because of some other event. The terminal user then triggers a group reservation operation on the target interaction group, so that he or she can re-enter the target interaction group the next time a song is played.
Optionally, step 613 is further included after step 606.
Step 613, in response to receiving the closing operation of the interactive interface, displaying a target interaction entry corresponding to the target interaction group in a predetermined area of the user interface.
In a possible implementation manner, the terminal user does not need a group reservation operation for the target interaction group after listening to the target song. When the client receives a closing operation on the interactive interface, it displays a target interaction entry corresponding to the target interaction group in a predetermined area of the user interface, so that a terminal user who exits accidentally can re-enter the interactive interface through the target interaction entry.
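The two re-entry paths of steps 612 and 613 can be sketched together on the client side. This is an illustrative sketch; the class and attribute names are assumptions, not part of the patent text.

```python
# Hypothetical sketch of the client-side re-entry logic: a reserved group
# persists across sessions (step 612), while closing an unreserved group's
# interface leaves a temporary entry in a predetermined UI area (step 613).
class SongClient:
    def __init__(self):
        self.reserved_groups = set()  # kept for the next listening session
        self.ui_entries = []          # entries shown in the predetermined area

    def reserve_group(self, group_id: str) -> None:  # step 612
        self.reserved_groups.add(group_id)

    def close_interface(self, group_id: str) -> None:  # step 613
        # without a reservation, keep a temporary re-entry point in the UI
        if group_id not in self.reserved_groups:
            self.ui_entries.append(group_id)
```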
On the basis of the above embodiments, a song interaction method is further provided. Unlike a song playing interface in the related art, the song playing interface in the embodiment of the application includes at least one candidate interaction entry. When a trigger operation on a target interaction entry among the at least one candidate interaction entry is received, the terminal interface displays an interactive interface corresponding to a target interaction group, through which the terminal user receives real-time interaction information sent by interactive users and performs synchronized information interaction. This solves the problem in the related art that a terminal user cannot perform synchronized information interaction while a song is playing, and improves the interactivity between the terminal user and the corresponding interactive objects when listening to songs. The client can send the content of an interaction group creation request to the interaction server, improving the terminal user's participation in interaction groups and enriching the ways in which interaction groups are created. The method further discloses automatically displaying candidate interaction entries, acquired according to the song playing request of the target song, on the song playing interface when the target song is played, which improves the usability of the function. In the interactive interface, the terminal user can also share geographic position with the interactive objects, further improving the interactive experience when listening to songs.
Referring to fig. 8, a block diagram of a song interaction apparatus provided in an exemplary embodiment of the present application is shown, the apparatus including:
a playing interface display module 801, configured to display a song playing interface, where the song playing interface includes at least one candidate interaction portal, and different candidate interaction portals correspond to different interaction groups;
a trigger operation receiving module 802, configured to receive a trigger operation on a target interaction portal in at least one candidate interaction portal, where the target interaction portal corresponds to a target interaction group;
and an interactive interface display module 803, configured to display an interactive interface corresponding to the target interactive group, where the interactive interface is used to display real-time interactive information sent by an interactive object in the target interactive group.
Optionally, the playing interface display module 801 includes:
the first display unit is used for responding to a song playing request of a target song and sending an interactive entrance acquisition request to the interactive server;
and the second display unit is used for displaying at least one candidate interaction entrance on the song playing interface according to the candidate interaction entrance information sent by the interaction server, wherein the interaction song corresponding to the candidate interaction entrance has a preset relationship with the target song.
Optionally, the first display unit is further configured to:
acquiring song information of the target song in response to the song playing request, wherein the song information comprises at least one of a song name, a singer, a song style, an affiliated album and an affiliated year;
sending the interactive entrance acquisition request to the interactive server according to the song information;
the interactive songs corresponding to the candidate interactive entrances are the target songs, or the interactive songs corresponding to the candidate interactive entrances and the target songs correspond to the same singer, or the interactive songs corresponding to the candidate interactive entrances and the target songs belong to the same song style, or the interactive songs corresponding to the candidate interactive entrances and the target songs belong to the same album, or the interactive songs corresponding to the candidate interactive entrances and the target songs belong to the same year.
Optionally, the apparatus further comprises:
the interactive group creation module is used for responding to the received interactive group creation operation and sending an interactive group creation request to the interactive server, wherein the interactive group creation request comprises a group name and an interactive song corresponding to the interactive group, and the interactive server is used for creating a corresponding interactive entrance for the interactive group.
Optionally, the apparatus further comprises:
the information reporting module is used for responding to the triggering operation of the target interaction entrance and reporting first position information to the interaction server, wherein the first position information is the position information of the current geographical position;
optionally, the apparatus further comprises:
the information receiving module is used for receiving the real-time interaction information sent by the target interaction object and second position information of the target interaction object, the second position information is reported by the target interaction object, and the real-time interaction information and the second position information are forwarded by the interaction server;
and the information display module is used for displaying the real-time interaction information, the geographic position and the distance information on the interaction interface, wherein the geographic position is determined according to the second position information, and the distance information is determined according to the first position information and the second position information.
Optionally, the apparatus further comprises:
the first information sending module is used for responding to a received interactive information sending instruction and sending input target real-time interactive information to an interactive server, wherein the target real-time interactive information comprises at least one of character information, picture information and audio and video information, and the interactive server is used for forwarding the target real-time interactive information to other interactive objects in the target interactive group.
Optionally, the apparatus further comprises:
and the second information sending module is used for responding to the received comment publishing operation of the target real-time interactive information and sending the target real-time interactive information to a comment server, and the comment server is used for publishing the target real-time interactive information to a comment area corresponding to a target song.
Optionally, the apparatus further comprises:
and the instruction sending module is used for responding to the received closing operation of the interactive interface and sending an interactive quitting instruction to an interactive server, and the interactive server is used for removing the current interactive object from the target interactive group according to the interactive quitting instruction.
Optionally, the apparatus further comprises:
an operation reservation module, configured to receive a group reservation operation for the target interaction group;
and the entrance display module is used for responding to the received closing operation of the interactive interface and displaying the target interactive entrance corresponding to the target interactive group in a preset area of a user interface.
Referring to fig. 9, a block diagram of a terminal 900 according to an exemplary embodiment of the present application is shown. The terminal 900 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III) or an MP4 player (Moving Picture Experts Group Audio Layer IV). Terminal 900 may also be referred to by other names such as user equipment or portable terminal. Optionally, the terminal 900 may be the terminal device corresponding to the push streaming end 100 shown in fig. 1.
In general, terminal 900 includes: a processor 901 and a memory 902.
In some embodiments, terminal 900 can also optionally include: a peripheral interface 903 and at least one peripheral. Specifically, the peripheral device includes: at least one of a radio frequency circuit 904, a touch display screen 905, a camera 906, an audio circuit 907, a positioning component 908, and a power supply 909.
The peripheral interface 903 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 901 and the memory 902. In some embodiments, the processor 901, memory 902, and peripheral interface 903 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 901, the memory 902 and the peripheral interface 903 may be implemented on a separate chip or circuit board, which is not limited by this embodiment.
The Radio Frequency circuit 904 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 904 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 904 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 904 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 904 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the World Wide Web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 904 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The touch display screen 905 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. The touch display screen 905 also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 901 as a control signal for processing. The touch display 905 is used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one touch display 905, disposed on the front panel of the terminal 900; in other embodiments, there may be at least two touch display screens 905, respectively disposed on different surfaces of the terminal 900 or in a folding design; in still other embodiments, the touch display 905 may be a flexible display disposed on a curved or folded surface of the terminal 900. The touch display screen 905 can even be arranged in a non-rectangular irregular shape, i.e., a shaped screen. The touch display panel 905 can be made of materials such as LCD (Liquid Crystal Display) and OLED (Organic Light-Emitting Diode).
The camera assembly 906 is used to capture images or video. Optionally, camera assembly 906 includes a front camera and a rear camera. Generally, a front camera is used for realizing video call or self-shooting, and a rear camera is used for realizing shooting of pictures or videos. In some embodiments, the number of the rear cameras is at least two, and each of the rear cameras is any one of a main camera, a depth-of-field camera and a wide-angle camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize a panoramic shooting function and a VR (Virtual Reality) shooting function. In some embodiments, camera assembly 906 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The positioning component 908 is used to locate the current geographic location of the terminal 900 to implement navigation or LBS (Location Based Service). The positioning component 908 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
In some embodiments, terminal 900 can also include one or more sensors 910. The one or more sensors 910 include, but are not limited to: acceleration sensor 911, gyro sensor 912, pressure sensor 913, fingerprint sensor 914, optical sensor 915, and proximity sensor 916.
The acceleration sensor 911 can detect the magnitude of acceleration in three coordinate axes of the coordinate system established with the terminal 900. For example, the acceleration sensor 911 may be used to detect the components of the gravitational acceleration in three coordinate axes. The processor 901 can control the touch display 905 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 911. The acceleration sensor 911 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 912 may detect a body direction and a rotation angle of the terminal 900, and the gyro sensor 912 may cooperate with the acceleration sensor 911 to acquire a 3D motion of the user on the terminal 900. The processor 901 can implement the following functions according to the data collected by the gyro sensor 912: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 913 may be disposed on the side bezel of terminal 900 and/or underneath touch display 905. When the pressure sensor 913 is disposed at the side frame of the terminal 900, a user's grip signal on the terminal 900 may be detected, and left/right hand recognition or shortcut operations may be performed according to the grip signal. When the pressure sensor 913 is disposed at the lower layer of the touch display screen 905, operable controls on the UI can be controlled according to the user's pressure operation on the touch display screen 905. The operable control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 914 is used for collecting a fingerprint of a user to identify the identity of the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, processor 901 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 914 may be disposed on the front, back, or side of the terminal 900. When a physical key or vendor Logo is provided on the terminal 900, the fingerprint sensor 914 may be integrated with the physical key or vendor Logo.
The optical sensor 915 is used to collect ambient light intensity. In one embodiment, the processor 901 may control the display brightness of the touch display 905 based on the ambient light intensity collected by the optical sensor 915. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 905 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 905 is turned down. In another embodiment, the processor 901 can also dynamically adjust the shooting parameters of the camera assembly 906 according to the ambient light intensity collected by the optical sensor 915.
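The ambient-light-driven brightness adjustment described above can be sketched as a clamped linear mapping. The function name, the brightness bounds and the 1000-lux full scale are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch: map ambient light intensity (lux) to a display
# brightness level in [lo, hi], increasing brightness with ambient light.
def display_brightness(ambient_lux: float, lo: float = 0.2,
                       hi: float = 1.0, full_scale: float = 1000.0) -> float:
    """Clamped linear mapping from ambient light intensity to brightness."""
    return max(lo, min(hi, lo + (hi - lo) * ambient_lux / full_scale))
```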
A proximity sensor 916, also known as a distance sensor, is typically disposed on the front face of terminal 900. The proximity sensor 916 is used to collect the distance between the user and the front face of the terminal 900. In one embodiment, when the proximity sensor 916 detects that the distance between the user and the front face of the terminal 900 gradually decreases, the processor 901 controls the touch display 905 to switch from the screen-on state to the screen-off state; when the proximity sensor 916 detects that the distance gradually increases, the processor 901 controls the touch display 905 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 9 does not constitute a limitation of terminal 900, and may include more or fewer components than those shown, or may combine certain components, or may employ a different arrangement of components.
Referring to fig. 10, a schematic structural diagram of a server 1000 according to an embodiment of the present application is shown. The server 1000 may be used to implement the song interaction method provided in the above-described embodiments. The server 1000 may be the push streaming server 200 described in the embodiment of fig. 1. Specifically, the method comprises the following steps:
the server 1000 includes a Central Processing Unit (CPU) 1001, a system memory 1004 including a Random Access Memory (RAM) 1002 and a Read-Only Memory (ROM) 1003, and a system bus 1005 connecting the system memory 1004 and the central processing unit 1001. The server 1000 also includes a basic input/output system (I/O system) 1006, which facilitates the transfer of information between devices within the computer, and a mass storage device 1007, which stores an operating system 1013, application programs 1014, and other program modules 1015.
The basic input/output system 1006 includes a display 1008 for displaying information and an input device 1009, such as a mouse, keyboard, etc., for user input of information. Wherein the display 1008 and input device 1009 are connected to the central processing unit 1001 through an input-output controller 1010 connected to the system bus 1005. The basic input/output system 1006 may also include an input/output controller 1010 for receiving and processing input from a number of other devices, such as a keyboard, mouse, or electronic stylus. Similarly, the input-output controller 1010 also provides output to a display screen, a printer, or other type of output device.
The mass storage device 1007 is connected to the central processing unit 1001 through a mass storage controller (not shown) connected to the system bus 1005. The mass storage device 1007 and its associated computer-readable media provide non-volatile storage for the server 1000. That is, the mass storage device 1007 may include a computer readable medium (not shown) such as a hard disk or CD-ROM drive.
Without loss of generality, the computer-readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will appreciate that the computer storage media is not limited to the foregoing. The system memory 1004 and mass storage device 1007 described above may be collectively referred to as memory.
According to various embodiments of the present application, the server 1000 may also operate as a remote computer connected to a network such as the Internet. That is, the server 1000 may be connected to the network 1012 through the network interface unit 1011 connected to the system bus 1005, or the network interface unit 1011 may be used to connect to another type of network or a remote computer system (not shown).
The memory also includes one or more programs stored in the memory and configured to be executed by one or more processors. The one or more programs include instructions for implementing the server-side song interaction method.
The embodiment of the present application further provides a computer-readable storage medium, where at least one instruction is stored in the storage medium, and the at least one instruction is loaded and executed by a processor to implement the song interaction method provided in the above embodiments.
Optionally, the computer-readable storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a Solid State Drive (SSD), or an optical disc. The Random Access Memory may include a resistive Random Access Memory (ReRAM) and a Dynamic Random Access Memory (DRAM).
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.
Claims (12)
1. A song interaction method, the method comprising:
displaying a song playing interface, wherein the song playing interface comprises at least one candidate interaction inlet, and different candidate interaction inlets correspond to different interaction groups;
receiving a trigger operation of a target interaction inlet in at least one candidate interaction inlet, wherein the target interaction inlet corresponds to a target interaction group;
and displaying an interactive interface corresponding to the target interactive group, wherein the interactive interface is used for displaying real-time interactive information sent by the interactive objects in the target interactive group.
2. The method of claim 1, wherein displaying the song playback interface comprises:
responding to a song playing request of a target song, and sending an interactive entrance acquisition request to an interactive server;
and displaying at least one candidate interaction entrance on the song playing interface according to the candidate interaction entrance information sent by the interaction server, wherein the interaction song corresponding to the candidate interaction entrance has a preset relationship with the target song.
3. The method of claim 2, wherein sending an interactive portal get request to an interactive server in response to a song play request for a target song comprises:
acquiring song information of the target song in response to the song playing request, wherein the song information comprises at least one of a song name, a singer, a song style, an affiliated album and an affiliated year;
sending the interactive entrance acquisition request to the interactive server according to the song information;
the interactive songs corresponding to the candidate interactive entrances are the target songs, or the interactive songs corresponding to the candidate interactive entrances and the target songs correspond to the same singer, or the interactive songs corresponding to the candidate interactive entrances and the target songs belong to the same song style, or the interactive songs corresponding to the candidate interactive entrances and the target songs belong to the same album, or the interactive songs corresponding to the candidate interactive entrances and the target songs belong to the same year.
4. The method of claim 2, further comprising:
in response to a received interaction group creation operation, sending an interaction group creation request to the interaction server, wherein the interaction group creation request comprises a group name and an interactive song corresponding to the interaction group, and the interaction server is configured to create a corresponding interaction entry for the interaction group.
5. The method according to any one of claims 1 to 4, wherein after receiving the trigger operation on the target interaction entry among the at least one candidate interaction entry, the method further comprises:
in response to the trigger operation on the target interaction entry, reporting first position information to the interaction server, wherein the first position information indicates the current geographical position; and
after displaying the interactive interface corresponding to the target interaction group, the method further comprises:
receiving real-time interaction information sent by a target interaction object and second position information reported by the target interaction object, both forwarded by the interaction server; and
displaying the real-time interaction information, a geographical position, and distance information on the interactive interface, wherein the geographical position is determined according to the second position information, and the distance information is determined according to the first position information and the second position information.
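Claim 5 leaves the distance computation unspecified; one conventional way to derive the distance information from two reported latitude/longitude positions is the haversine formula. This is a minimal sketch under that assumption — the patent does not prescribe any particular formula.

```python
# Great-circle distance between the first position information (this terminal)
# and the second position information (the target interaction object),
# computed with the haversine formula.
import math

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Distance in kilometres between two (latitude, longitude) points in degrees."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

The terminal already holds the first position information locally, so only the second position information needs to be forwarded by the interaction server before the distance can be rendered on the interactive interface.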
6. The method according to any one of claims 1 to 4, wherein after displaying the interactive interface corresponding to the target interaction group, the method further comprises:
in response to a received interaction information sending instruction, sending input target real-time interaction information to the interaction server, wherein the target real-time interaction information comprises at least one of text information, picture information, and audio/video information, and the interaction server is configured to forward the target real-time interaction information to other interaction objects in the target interaction group.
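The "at least one of text, picture, and audio/video" constraint in claim 6 amounts to a validation rule on the outgoing message. The sketch below illustrates one possible payload shape; all names (`group_id`, `sender`, `content`) are assumptions, not from the patent.

```python
# Illustrative validation and wrapping of target real-time interaction
# information before it is sent to the interaction server for forwarding
# to the other interaction objects in the target interaction group.

ALLOWED_KINDS = {"text", "picture", "audio_video"}

def build_interaction_message(group_id: str, sender_id: str, content: dict) -> dict:
    """Validate that the content carries at least one allowed kind, then wrap it."""
    if not set(content) & ALLOWED_KINDS:
        raise ValueError("message must carry text, picture, or audio/video content")
    return {"group_id": group_id, "sender": sender_id, "content": content}
```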
7. The method of claim 6, wherein the target real-time interaction information is displayed in the interactive interface, and the method further comprises:
in response to a received comment publishing operation on the target real-time interaction information, sending the target real-time interaction information to a comment server, wherein the comment server is configured to publish the target real-time interaction information to a comment area corresponding to the target song.
8. The method according to any one of claims 1 to 4, wherein after displaying the interactive interface corresponding to the target interaction group, the method further comprises:
in response to a received closing operation on the interactive interface, sending an interaction exit instruction to the interaction server, wherein the interaction server is configured to remove the current interaction object from the target interaction group according to the interaction exit instruction.
9. The method according to any one of claims 1 to 4, wherein after displaying the interactive interface corresponding to the target interaction group, the method further comprises:
receiving a group reservation operation for the target interaction group; and
in response to a received closing operation on the interactive interface, displaying the target interaction entry corresponding to the target interaction group in a preset area of a user interface.
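Claims 8 and 9 describe two alternative outcomes of the same closing operation: normally an exit instruction is sent (claim 8), but a prior group reservation instead keeps the entry pinned in a preset area (claim 9). A minimal sketch of that branch, with all names being illustrative assumptions:

```python
# Hypothetical handler for the closing operation on the interactive interface.
# If the user has reserved the group, keep its entry visible in a preset area
# of the user interface; otherwise signal that an exit instruction would be
# sent to the interaction server.

def on_interface_close(group_id: str, reserved: bool, pinned_entries: list) -> str:
    """Handle the close operation, honouring a prior group reservation."""
    if reserved:
        pinned_entries.append(group_id)  # keep the interaction entry displayed
        return "entry_pinned"
    return "exit_sent"  # the terminal would send an interaction exit instruction
```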
10. A song interaction apparatus, characterized in that the apparatus comprises:
a playing interface display module, configured to display a song playing interface, wherein the song playing interface comprises at least one candidate interaction entry, and different candidate interaction entries correspond to different interaction groups;
a trigger operation receiving module, configured to receive a trigger operation on a target interaction entry among the at least one candidate interaction entry, wherein the target interaction entry corresponds to a target interaction group; and
an interactive interface display module, configured to display an interactive interface corresponding to the target interaction group, wherein the interactive interface is used for displaying real-time interaction information sent by interaction objects in the target interaction group.
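The claim-10 apparatus mirrors the method steps as modules. A hedged sketch of that structure as plain classes — the class and method names are illustrative, not the patent's:

```python
# Illustrative decomposition of the claim-10 song interaction apparatus into
# a trigger-operation receiving module and an interactive-interface display
# module.

class TriggerOperationReceiver:
    """Maps a triggered candidate interaction entry to its interaction group."""

    def receive(self, entry_id: str, entries: dict) -> str:
        return entries[entry_id]  # entry id -> target interaction group id

class InteractiveInterfaceDisplay:
    """Produces the interface that shows the group's real-time interaction info."""

    def show(self, group_id: str) -> str:
        return f"interface:{group_id}"
```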
11. A terminal, characterized in that the terminal comprises a processor and a memory, wherein the memory stores at least one instruction for execution by the processor to implement the song interaction method of any one of claims 1 to 9.
12. A computer-readable storage medium having stored thereon at least one instruction for execution by a processor to implement the song interaction method of any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010450193.6A CN111628925B (en) | 2020-05-25 | 2020-05-25 | Song interaction method, device, terminal and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111628925A true CN111628925A (en) | 2020-09-04 |
CN111628925B CN111628925B (en) | 2023-11-14 |
Family
ID=72260695
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010450193.6A Active CN111628925B (en) | 2020-05-25 | 2020-05-25 | Song interaction method, device, terminal and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111628925B (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104092596A (en) * | 2014-01-20 | 2014-10-08 | 腾讯科技(深圳)有限公司 | Music user group management method, device and system |
CN105897867A (en) * | 2016-03-29 | 2016-08-24 | 乐视控股(北京)有限公司 | Share processing method of interaction information, vehicle terminal, server and system |
CN106231436A (en) * | 2016-08-30 | 2016-12-14 | 乐视控股(北京)有限公司 | Message treatment method and processing means |
CN106341695A (en) * | 2016-08-31 | 2017-01-18 | 腾讯数码(天津)有限公司 | Interaction method, device and system of live streaming room |
US20190124400A1 (en) * | 2016-08-31 | 2019-04-25 | Tencent Technology (Shenzhen) Company Limited | Interactive method, apparatus, and system in live room |
US20200021545A1 (en) * | 2017-08-02 | 2020-01-16 | Tencent Technology (Shenzhen) Company Limited | Method, device and storage medium for interactive message in video page |
CN110209871A (en) * | 2019-06-17 | 2019-09-06 | 广州酷狗计算机科技有限公司 | Song comments on dissemination method and device |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113518253A (en) * | 2021-04-29 | 2021-10-19 | 广州酷狗计算机科技有限公司 | Song playing method and device, terminal equipment and storage medium |
CN114327221A (en) * | 2021-12-24 | 2022-04-12 | 杭州网易云音乐科技有限公司 | Lighting method, medium, device and computing equipment |
CN114625466A (en) * | 2022-03-15 | 2022-06-14 | 广州歌神信息科技有限公司 | Method and device for performing and controlling interaction of online song hall, equipment, medium and product |
CN114625466B (en) * | 2022-03-15 | 2023-12-08 | 广州歌神信息科技有限公司 | Interactive execution and control method and device for online singing hall, equipment, medium and product |
CN114885200A (en) * | 2022-04-26 | 2022-08-09 | 北京达佳互联信息技术有限公司 | Message processing method and device, electronic equipment and computer readable storage medium |
CN114885200B (en) * | 2022-04-26 | 2024-01-02 | 北京达佳互联信息技术有限公司 | Message processing method, device, electronic equipment and computer readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN111628925B (en) | 2023-11-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110267067B (en) | Live broadcast room recommendation method, device, equipment and storage medium | |
CN111327953B (en) | Live broadcast voting method and device and storage medium | |
CN110061900B (en) | Message display method, device, terminal and computer readable storage medium | |
CN110278464B (en) | Method and device for displaying list | |
CN111050189B (en) | Live broadcast method, device, equipment and storage medium | |
CN112118477B (en) | Virtual gift display method, device, equipment and storage medium | |
CN109451343A (en) | Video sharing method, apparatus, terminal and storage medium | |
CN109327608B (en) | Song sharing method, terminal, server and system | |
US20220191557A1 (en) | Method for displaying interaction data and electronic device | |
CN111628925B (en) | Song interaction method, device, terminal and storage medium | |
CN112764608B (en) | Message processing method, device, equipment and storage medium | |
CN110248236B (en) | Video playing method, device, terminal and storage medium | |
CN109302385A (en) | Multimedia resource sharing method, device and storage medium | |
CN113490010B (en) | Interaction method, device and equipment based on live video and storage medium | |
CN113395566B (en) | Video playing method and device, electronic equipment and computer readable storage medium | |
CN110750734A (en) | Weather display method and device, computer equipment and computer-readable storage medium | |
CN114245218B (en) | Audio and video playing method and device, computer equipment and storage medium | |
CN107896337B (en) | Information popularization method and device and storage medium | |
CN113411680A (en) | Multimedia resource playing method, device, terminal and storage medium | |
CN113204671A (en) | Resource display method, device, terminal, server, medium and product | |
CN111031391A (en) | Video dubbing method, device, server, terminal and storage medium | |
CN110337042B (en) | Song on-demand method, on-demand order processing method, device, terminal and medium | |
CN111399796B (en) | Voice message aggregation method and device, electronic equipment and storage medium | |
CN112559795A (en) | Song playing method, song recommending method, device and system | |
CN111984871A (en) | Friend recommendation method, friend recommendation display method, device and equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||