CN116931773A - Information processing method and device and electronic equipment - Google Patents
Information processing method and device and electronic equipment
- Publication number
- CN116931773A (application CN202310964010.6A)
- Authority
- CN
- China
- Prior art keywords
- interface
- target
- displaying
- information
- displayed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F2203/04802—3D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user
Abstract
An embodiment of the invention discloses an information processing method and apparatus, and an electronic device. One embodiment of the method includes: in response to an information acquisition operation on a target three-dimensional entity, playing a target video on a first interface and displaying a target entry on the first interface; and in response to a trigger operation on the target entry, displaying a second interface, where the second interface displays interactive resource information and an interactive control. A new information processing mode is thus provided.
Description
Technical Field
The disclosure relates to the field of computer technology, and in particular, to an information processing method, an information processing device and electronic equipment.
Background
With the development of computers, users can implement various functions using electronic devices. For example, a user may view a variety of information through an electronic device.
In some scenarios, with the rapid development of the mobile internet and terminal devices, people can acquire information, interact with others, and so on through terminal devices.
Disclosure of Invention
This section is provided to introduce concepts in a simplified form that are further described below in the detailed description. It is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In a first aspect, an embodiment of the present disclosure provides an information processing method, including: in response to an information acquisition operation on a target three-dimensional entity, playing a target video on a first interface and displaying a target entry on the first interface; and in response to a trigger operation on the target entry, displaying a second interface, where the second interface displays interactive resource information and an interactive control.
In a second aspect, an embodiment of the present disclosure provides an information processing apparatus, including: a first display unit configured to play a target video on a first interface and display a target entry on the first interface in response to an information acquisition operation on a target three-dimensional entity; and a second display unit configured to display a second interface in response to a trigger operation on the target entry, where the second interface displays interactive resource information and interactive controls.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: one or more processors; and a storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the information processing method as described in the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides a computer-readable medium, on which a computer program is stored, which when executed by a processor, implements the steps of the information processing method according to the first aspect.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is a flow chart of one embodiment of an information processing method according to the present disclosure;
fig. 2A and 2B are schematic views of an application scenario of the information processing method according to the present disclosure;
fig. 3A, 3B, and 3C are schematic views of one application scenario of the information processing method according to the present disclosure;
FIG. 4 is a schematic diagram of one application scenario of an information processing method according to the present disclosure;
FIG. 5 is a schematic structural view of one embodiment of an information processing apparatus according to the present disclosure;
FIG. 6 is an exemplary system architecture to which the information processing method of one embodiment of the present disclosure may be applied;
fig. 7 is a schematic view of a basic structure of an electronic device provided according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are intended to be open-ended, i.e., including, but not limited to. The term "based on" is based at least in part on. The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments. Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "one" and "a plurality" in this disclosure are illustrative rather than limiting, and those of ordinary skill in the art will appreciate that "one" should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
Referring to fig. 1, a flow of one embodiment of an information processing method according to the present disclosure is shown. The information processing method as shown in fig. 1 includes the steps of:
in step 101, in response to an information acquisition operation for a target three-dimensional entity, a target video is played on a first interface, and a target entry is displayed on the first interface.
In this embodiment, an execution subject (e.g., a server and/or a terminal device) of the information processing method may play a target video at a first interface in response to an information acquisition operation for a target three-dimensional entity, and display a target entry at the first interface.
The target three-dimensional entity may be a three-dimensional physical model. Its specific style can be set according to actual needs and is not limited here. As an example, the target three-dimensional entity may be a heart-shaped model, or a real object in a real environment. The target three-dimensional entity may be virtually any three-dimensional model; it is referred to as the target three-dimensional entity in this disclosure for ease of description.
Here, the above-described information acquisition operation may be used to acquire information of a target three-dimensional entity. The acquired information may include, but is not limited to, image information, text information, identification information, three-dimensional structure information.
Optionally, the above information obtaining operation may include, but is not limited to, at least one of: scanning operation, image capturing operation.
As an example, a scanning operation aimed at the target three-dimensional entity may be performed on a scanning interface, and the target video is then played on the first interface based on the information obtained by the scanning operation.
Here, the target video may include, but is not limited to, at least one of: text, images, video, animation.
Here, the target video may correspond to the target three-dimensional entity (for example, if the target three-dimensional entity is a heart-shaped model, the target video may be a video including a heart-shaped image). The target video corresponding to the target three-dimensional entity may include an image of the target three-dimensional entity, including an abstract image of it. The target video can interact with the user. As an example, the target video may be an animated image, an augmented reality image, or a virtual reality image.
Here, the target entry may be displayed on the first interface. The target entry may be an entry corresponding to the second interface and may be used to trigger display of the second interface.
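As a minimal Kotlin sketch of this step (all type and function names here are hypothetical illustrations, not taken from the disclosure), the scan-triggered flow might look like:

```kotlin
// Hypothetical sketch of step 101: an information-acquisition (e.g. scan) event
// triggers target-video playback and display of the target entry on the first
// interface. All types and names are illustrative, not taken from the disclosure.

data class ScanResult(val entityId: String)   // information obtained from the 3D entity
data class TargetVideo(val uri: String)

class FirstInterface {
    fun playVideo(video: TargetVideo) = println("playing ${video.uri}")
    fun showTargetEntry(label: String) = println("showing target entry: $label")
}

// Assumed lookup from the scanned entity to its associated target video.
fun videoFor(result: ScanResult) = TargetVideo("videos/${result.entityId}.mp4")

fun onInfoAcquired(result: ScanResult, ui: FirstInterface) {
    ui.playVideo(videoFor(result))        // play the target video on the first interface
    ui.showTargetEntry("click to view")   // entry that will trigger the second interface
}

fun main() = onInfoAcquired(ScanResult(entityId = "cone-001"), FirstInterface())
```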
Step 102: in response to a trigger operation on the target entry, displaying a second interface.
Here, the second interface displays interactive resource information and interactive controls.
The second interface is an interface for acquiring resources. The resource may be a virtual resource or a physical resource. The second interface may be displayed full screen or half screen.
The interactive resource information may indicate the interactive resources that are available, for example virtual resources that can be obtained or shared. The interactive resource information may include the type, name, quantity, and acquisition guidance of the interactive resource.
Based on the prompts in the interactive resource information, the user can operate on the interactive resources through the interactive control, for example to obtain or share them.
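A minimal sketch of the data the second interface might carry and how the controls could dispatch on it (names and structure are assumptions for illustration):

```kotlin
// Hypothetical model of the second interface's content: interactive resource
// information plus the controls that act on it. Names are illustrative only.

data class InteractiveResourceInfo(
    val type: String,      // e.g. "coupon"
    val name: String,      // e.g. "ice cream coupon"
    val count: Int,        // quantity on offer
    val guidance: String   // acquisition-mode prompt shown to the user
)

sealed interface InteractiveControl {
    object Get : InteractiveControl     // obtain the resource
    object Share : InteractiveControl   // share the resource with another account
}

fun onControlTriggered(control: InteractiveControl, info: InteractiveResourceInfo) =
    when (control) {
        InteractiveControl.Get -> println("obtaining ${info.count} x ${info.name}")
        InteractiveControl.Share -> println("sharing ${info.name}")
    }

fun main() {
    val info = InteractiveResourceInfo("coupon", "ice cream coupon", 1, "tap Get to claim")
    onControlTriggered(InteractiveControl.Get, info)
    onControlTriggered(InteractiveControl.Share, info)
}
```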
As an example, as shown in fig. 2A, a three-dimensional cone model is being scanned in the scan box 201 of the electronic device shown in fig. 2A. A three-dimensional cone model image 202 may also be displayed on the screen of the electronic device shown in fig. 2A.
As an example, referring to fig. 2B, fig. 2B illustrates the target video display area 204 described above. The target video displayed in the target video display area 204 may include an image of the target three-dimensional cone model and a decoration effect applied to it.
As an example, referring to fig. 2B, a user triggering the target entry 205 in fig. 2B may trigger a page jump. The text shown on the target entry 205 ("click to view") may be understood as interface jump guidance information.
For example, referring to fig. 3A, fig. 3A illustrates a second interface 302. The second interface 302 can display interactive controls, which may include a get control 3021 and a share control 3022. The second interface 302 may also display interactive resource information (e.g., "add some coolness to your summer", as in fig. 2A).
It should be noted that in the information processing manner provided in this embodiment, information about a target three-dimensional entity is obtained by operating on the entity; based on that operation, a target video is played on a first interface and a target entry is displayed on the first interface; and in response to an operation on the target entry, a second interface is displayed, where the second interface includes an interactive control and interactive resource information. By setting up a target three-dimensional entity that attracts the user and presenting a target video, user participation is improved; at the same time, the target entry to the interface is presented, and the second interface can display interactive resource information and an interactive control, which can improve the efficiency of interaction and resource acquisition.
In some embodiments, the interactive control is a sharing control. The method then further includes: in response to a trigger operation on the sharing control, sharing the interactive resources corresponding to the interactive resource information.
The types and the number of the interactive resources can be set according to actual application scenes, and are not limited herein.
For example, if user A scans a target three-dimensional cone entity using an electronic device, the second interface 302 described above may be presented. User A clicks the share control 3022 to share virtual resources. For example, user A shares an ice cream coupon with user B, who may go to a designated merchant to redeem an ice cream.
In some embodiments, the interactive control includes a resource acquisition control, and the method further includes: in response to an operation on the resource acquisition control in the second interface, allocating a virtual resource to a first account.
Here, the first account is the account of the user who scanned the offline preset target three-dimensional entity.
As an example, if the user clicks the get control 3021 in fig. 3A, this may be regarded as performing the resource acquisition operation. The execution subject may then allocate virtual resources to the first account.
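A sketch of the get-control handler under these assumptions (the account model and function names are hypothetical):

```kotlin
// Hypothetical sketch of the get-control handler: a resource-acquisition
// operation allocates the virtual resource to the first account, i.e. the
// account that scanned the offline preset entity.

data class Account(val id: String, var balance: Int = 0)

fun allocateVirtualResource(firstAccount: Account, amount: Int) {
    firstAccount.balance += amount   // credit the virtual resource to the account
    println("allocated $amount unit(s) to ${firstAccount.id}")
}

fun main() {
    val scanner = Account(id = "user-A")           // the account behind the scan
    allocateVirtualResource(scanner, amount = 1)
}
```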
In some embodiments, the information acquisition operation includes a scanning operation.
In some embodiments, step 101, i.e., playing the target video on the first interface and displaying the target entry on the first interface in response to the information acquisition operation on the target three-dimensional entity, may include: in response to a scanning operation on the target three-dimensional entity, playing an augmented reality scene image associated with the target three-dimensional entity on the first interface.
In some embodiments, the method further comprises: and responding to the interactive operation aiming at the augmented reality scene image, and displaying the operation result of the interactive operation in real time.
The augmented reality scene image may be a superposition of the actually scanned/photographed scene image and a virtual scene image. As an example, when a user performs an interactive operation on the augmented reality scene image, for example triggering a control in it, a preset virtual special effect may be displayed in the actual scene image as the operation result.
In some embodiments, the method may further include: displaying a third interface in response to the trigger operation on the target entry.
It will be appreciated that the two interfaces may be presented simultaneously, e.g., side by side, or one interface may partially or completely overlap the other.
Optionally, a second interface is displayed on the third interface, and the second interface partially covers the third interface.
In some embodiments, the method may include: in response to a trigger operation on the target entry, generating an interface display request according to the target three-dimensional entity image; and displaying the third interface and the second interface according to the interface content returned for the interface display request.
Here, the interface display request includes the target three-dimensional entity identifier.
In response to the interface display request, the server determines the interface identifier corresponding to the target three-dimensional entity identifier according to the correspondence between three-dimensional entity identifiers and interfaces, and returns the corresponding interface content.
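A sketch of this server-side resolution, assuming a simple preset mapping from entity identifiers to interface content (all names are hypothetical):

```kotlin
// Hypothetical server-side resolution: the interface display request carries the
// target entity identifier, which a preset mapping resolves to interface content.

data class InterfaceRequest(val entityId: String)
data class InterfaceContent(val secondInterface: String, val thirdInterface: String)

// Assumed preset correspondence between entity identifiers and interfaces.
val entityToInterface = mapOf(
    "cone-001" to InterfaceContent("coupon-panel", "local-feed-page")
)

fun handleDisplayRequest(req: InterfaceRequest): InterfaceContent? =
    entityToInterface[req.entityId]

fun main() = println(handleDisplayRequest(InterfaceRequest("cone-001")))
```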
It is understood that the screen display of the electronic device may be implemented using layers, where a layer above another has display priority. The second interface is shown on the upper layer and the third interface on the lower layer, so the second interface may occlude a portion of the third interface. The occluded part may be referred to as a first region and the non-occluded part as a second region; the second region may include the identification of the third interface.
Optionally, the second region, in which the third interface is not occluded, includes the third interface's identification.
For example, referring to fig. 3A, the second interface 302 in fig. 3A occludes a portion of the third interface 301. The non-occluded portion of the third interface includes a search box.
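A minimal sketch of the occlusion arithmetic implied by this layering, reduced to one vertical dimension for brevity (the names and representation are assumptions):

```kotlin
// Hypothetical occlusion arithmetic for the two layers, reduced to vertical
// extents: the first region is covered by the second interface, while the second
// region (here, everything above the panel's top edge) stays visible.

data class Extent(val top: Int, val bottom: Int)   // screen coordinates, top < bottom

fun visibleRegionOfThird(third: Extent, second: Extent): Extent =
    Extent(third.top, minOf(third.bottom, second.top))

fun main() {
    val third = Extent(top = 0, bottom = 100)
    val second = Extent(top = 67, bottom = 100)    // panel over the bottom third
    println(visibleRegionOfThird(third, second))   // Extent(top=0, bottom=67)
}
```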
Optionally, in some scenarios, the target three-dimensional entity may be presented in association with offline preset activity information.
As an example, a three-dimensional cone model may be set up at an offline site, with activity information (e.g., "cooling" in fig. 2A) presented in association with it. A user may scan the three-dimensional cone model arranged at the offline site with an electronic device, and the three-dimensional cone model and the activity text may then be displayed on the screen of the electronic device.
In this way, the offline activity and the online activity can be associated: the user first sees the activity information offline and, upon seeing the jump guidance information associated with the target video, is more inclined to open and view the third interface, check the activity, participate in it, and acquire resources.
It should be noted that by displaying the target three-dimensional entity offline, scanning it to obtain a target three-dimensional entity image, displaying the target video corresponding to the entity together with the target entry associated with the video, and displaying the third and second interfaces in response to a trigger operation on the target entry, the third interface is tied to the offline activity (i.e., the offline preset target three-dimensional entity). Setting up an attractive target three-dimensional entity offline and using a target video improves the user's sense of participation; presenting the target entry at the same time raises the probability that the user performs the jump operation, improving the conversion rate from offline to online activity, so that the user jumps to and opens the third interface and participates in the activity.
In some embodiments, the method further includes at least one of the following: in response to a first sliding operation on the second interface, controlling the second interface to completely cover the third interface; and in response to a second sliding operation on the second interface, canceling display of the second interface.
Here, the sliding directions of the first and second sliding operations may each be set according to the actual application scenario and are not limited here.
As an example, referring to fig. 3A and 3B, performing the first sliding operation on the second interface 302 shown in fig. 3A may enlarge the interface area of the second interface, giving the display effect of fig. 3B, i.e., the interface area of the second interface becomes larger.
As an example, if a slide-up operation is performed on the second interface 302 shown in fig. 3A, display of the second interface may be canceled. If the third interface was presented together with the second interface, the third interface is then displayed in its entirety, as shown in fig. 3C. Fig. 3C shows the expand control 303; triggering the expand control 303 may trigger presentation of the second interface.
In this way, the user can conveniently operate the second interface as needed, which improves operation efficiency.
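A sketch of this gesture mapping (the concrete sliding directions are left abstract here, since the disclosure leaves them to the application scenario):

```kotlin
// Hypothetical mapping of the two sliding operations onto second-interface
// states; the concrete gesture directions are application-defined.

enum class Slide { FIRST, SECOND }
enum class PanelState { PARTIAL, FULL, HIDDEN }

fun onSlide(slide: Slide): PanelState = when (slide) {
    Slide.FIRST -> PanelState.FULL     // second interface completely covers the third
    Slide.SECOND -> PanelState.HIDDEN  // display of the second interface is cancelled
}

fun main() {
    var state = PanelState.PARTIAL                 // initial partial-cover presentation
    state = onSlide(Slide.FIRST); println(state)   // FULL
    state = onSlide(Slide.SECOND); println(state)  // HIDDEN
}
```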
In some embodiments, the method may further include: loading the second interface in response to completion of loading the third interface.
Here, as to the loading order of the third interface and the second interface, the third interface may be loaded first and then the second interface.
For example, when loading of the third interface's content, such as its grid entry area and feed stream, is confirmed to be complete, it is judged whether the second interface needs to be loaded; if so, the second interface can be loaded. If the second interface does not need to be loaded, the third interface alone is displayed.
In this way, the screen is never left blank, ensuring the integrity and rationality of the interface display.
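A sketch of this loading order (function names are assumptions; the point is only the sequencing and the conditional load):

```kotlin
// Hypothetical loading sequence: the third interface loads first; the second
// interface is loaded only after that completes, and only if it is needed.

fun loadThirdInterface() = println("third interface loaded")
fun loadSecondInterface() = println("second interface loaded")

fun showInterfaces(secondNeeded: Boolean) {
    loadThirdInterface()                      // third interface first, so the screen is never blank
    if (secondNeeded) loadSecondInterface()   // conditional, as judged after loading completes
}

fun main() = showInterfaces(secondNeeded = true)
```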
In some embodiments, the method may further include: displaying a third interface in response to a closing operation on the second interface.
The third interface may be a video display interface, a merchandise display interface, or an interface that displays both video and merchandise. In some embodiments, the third interface is used to display a media content stream associated with a target area. As an example, the target area may be an area set by the user, and the media content stream associated with it may be, for example, a feed stream. As shown in fig. 3C, a media content stream may be presented in the presentation area 304 of fig. 3C.
Here, the second interface may be closed. The closing operation may be performed before or after the interactive control is triggered, and the second interface may be closed whether it is in a smaller or a larger display state.
For example, referring to fig. 3B, fig. 3B shows a close control 3023. Clicking the close control 3023 may close the second interface, i.e., cease its presentation. Once the second interface stops being presented, the third interface may be displayed in its entirety, as shown in fig. 3C. Fig. 3C shows the expand control 303; triggering the expand control 303 may trigger presentation of the second interface.
In this way, when the second interface does not need to be displayed, it can be closed so that the complete third interface is shown, which the user can then conveniently operate.
In some embodiments, the displaying of the second interface in response to the trigger operation on the target entry includes: determining whether the second interface is associated with a preset element; and in response to the second interface being associated with the preset element, displaying the second interface according to the position of the preset element in the third interface.
Here, the preset element is an element in the third interface, specifically a display element of the third interface, such as a function button, function area, navigation control, navigation area, or floating widget displayed in the third interface.
Here, any element of the third interface may be set as a preset element, and the association between a preset element and the second interface can be set in advance. When the second interface is loaded, it is judged whether the second interface has an associated preset element. If there is none, the second interface may be loaded in the default manner used when no preset element exists (e.g., displayed at one third of the screen height). If the second interface does have an associated preset element, the second interface is displayed according to the display position of that preset element in the third interface.
How the second interface is displayed according to the display position of the preset element in the third interface can be set according to the actual application scenario and is not limited here. For example, the second interface may be displayed adjacent to that display position.
Displaying the second interface according to the display position of the preset element may include, but is not limited to, at least one of the following: the second interface does not occlude the preset element displayed in the third interface; the second interface includes pointing information that points to the preset element in the third interface; or the second interface expands gradually according to the position of the preset element. The pointing information may be a directional mark and/or text, such as an arrow; pointing to the preset element in the third interface then means that the arrow points at the preset element.
In this way, the display of the second interface can be associated with the preset element, strengthening the association between the content of the second interface and the preset element.
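A sketch of this position decision, assuming a simple fractional-height layout model (all names and the fallback height are illustrative):

```kotlin
// Hypothetical position decision: with an associated preset element, place the
// second interface adjacent to it (non-occluding) and point at it; otherwise
// fall back to a default height. Fractions and names are illustrative.

data class PanelLayout(val topFraction: Double, val pointsAt: String?)

fun layoutSecondInterface(presetElement: String?): PanelLayout =
    if (presetElement != null)
        PanelLayout(topFraction = 0.5, pointsAt = presetElement)  // adjacent to the element
    else
        PanelLayout(topFraction = 2.0 / 3, pointsAt = null)       // default: bottom third of the screen

fun main() {
    println(layoutSecondInterface("food-grid-entry"))
    println(layoutSecondInterface(null))
}
```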
In some embodiments, the second interface is displayed outside the preset element of the third interface, i.e., the second interface does not occlude the preset element in the third interface.
As an example, referring to fig. 4, fig. 4 shows a preset element 401, namely a "food" grid entry. The second interface 302 does not occlude the preset element 401; that is, the display area or position of the second interface is determined according to the position of the preset element.
In some embodiments, the second interface includes pointing information. As an example, referring to fig. 4, the second interface 302 may include pointing information 402.
When the third interface and the second interface are displayed, the pointing information in the second interface points to the preset element in the third interface.
As an example, fig. 4 shows the pointing information 402 of the second interface 302 and the preset element 401 of the third interface.
In some embodiments, the second interface is expanded and presented gradually, the gradual expansion process including at least one of the following: the second interface expands starting from the preset element in the third interface; and the expansion stops when the distance between the second interface and the preset element in the third interface is smaller than a preset distance.
As an example, as shown in fig. 4, if the second interface expands gradually from top to bottom, it expands downward from the height of the preset element 401; if it expands gradually from bottom to top, it stops rising when it reaches the height of the preset element 401.
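A sketch of the bottom-to-top case, clamping the expansion at a preset distance from the element (numbers and names are illustrative):

```kotlin
// Hypothetical bottom-to-top expansion: the top edge of the second interface
// rises step by step and is clamped a preset distance below the preset element.

data class PresetElement(val top: Int)   // vertical position of the element

fun expandUpward(currentTop: Int, element: PresetElement, minGap: Int, step: Int): Int {
    val next = currentTop - step      // rising means the top-edge coordinate decreases
    val limit = element.top + minGap  // stop before reaching the element
    return maxOf(next, limit)
}

fun main() {
    val element = PresetElement(top = 40)
    var top = 100                     // second interface starts collapsed at the bottom
    repeat(10) { top = expandUpward(top, element, minGap = 5, step = 10) }
    println("second interface top = $top")   // clamped at 45; the element stays visible
}
```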
With further reference to fig. 5, as an implementation of the method shown in the foregoing figures, the present disclosure provides an embodiment of an information processing apparatus, which corresponds to the method embodiment shown in fig. 1, and which is particularly applicable to various electronic devices.
As shown in fig. 5, the information processing apparatus of this embodiment includes a first display unit 501 and a second display unit 502. The first display unit is configured to play a target video on a first interface and display a target entry on the first interface in response to an information acquisition operation on a target three-dimensional entity; the second display unit is configured to display a second interface in response to a trigger operation on the target entry, where the second interface displays interactive resource information and interactive controls.
In this embodiment, the specific processing of the first display unit 501 and the second display unit 502 of the information processing apparatus and the technical effects thereof may refer to the description of the steps 101 and 102 in the corresponding embodiment of fig. 1, and are not repeated here.
In some embodiments, the interactive control is a sharing control, and the apparatus is further configured to: in response to a trigger operation on the sharing control, share the interactive resources corresponding to the interactive resource information.
In some embodiments, the information acquisition operation comprises a scanning operation, and the playing of the target video on the first interface and displaying of the target entry on the first interface in response to the information acquisition operation on the target three-dimensional entity includes: in response to the scanning operation on the target three-dimensional entity, playing an augmented reality scene image associated with the target three-dimensional entity on the first interface.
In some embodiments, the apparatus is further configured to: in response to the trigger operation on the target entry, display a third interface, wherein the second interface is displayed on an upper layer of the third interface and covers a partial area of the third interface.
In some embodiments, the apparatus is further configured to: in response to a first sliding operation on the second interface, control the second interface to completely cover the third interface; and in response to a second sliding operation on the second interface, cancel display of the second interface.
In some embodiments, in response to a closing operation on the second interface, a third interface is displayed, the third interface being used to display a media content stream associated with a target area.
In some embodiments, the displaying of the second interface in response to the trigger operation on the target entry includes: determining whether the second interface is associated with a preset element, wherein the preset element is an element in a third interface; and in response to the second interface being associated with the preset element, displaying the second interface according to the position of the preset element in the third interface.
In some embodiments, the second interface is displayed outside the preset element.
In some embodiments, the second interface includes pointing information, and when the third interface and the second interface are displayed, the pointing information in the second interface points to the preset element in the third interface.
In some embodiments, the second interface is expanded and presented gradually, the gradual expansion process including at least one of the following: the second interface expands starting from the preset element in the third interface; and the expansion stops when the distance between the second interface and the preset element in the third interface is smaller than a preset distance.
Referring to fig. 6, fig. 6 illustrates an exemplary system architecture in which an information processing method of an embodiment of the present disclosure may be applied.
As shown in fig. 6, the system architecture may include terminal devices 601, 602, 603, a network 604, and a server 605. The network 604 is used as a medium to provide communication links between the terminal devices 601, 602, 603 and the server 605. The network 604 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The terminal devices 601, 602, 603 may interact with the server 605 via the network 604 to receive or send messages and the like. Various client applications, such as web browser applications, search applications, and news information applications, may be installed on the terminal devices 601, 602, 603. The client applications in the terminal devices 601, 602, 603 may receive user instructions and perform corresponding functions accordingly, for example adding corresponding information to existing information according to a user instruction.
The terminal devices 601, 602, 603 may be hardware or software. When they are hardware, they may be various electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers, desktop computers, and the like. When they are software, they can be installed in the electronic devices listed above and may be implemented as multiple software modules (e.g., modules for providing distributed services) or as a single software module, which is not specifically limited here.
The server 605 may be a server providing various services, for example one that receives information acquisition requests sent by the terminal devices 601, 602, 603, acquires the presentation information corresponding to those requests in various ways, and transmits the related data for displaying the information back to the terminal devices 601, 602, 603.
It should be noted that the information processing method provided by the embodiments of the present disclosure may be performed by a terminal device, and accordingly, the information processing apparatus may be provided in the terminal devices 601, 602, 603. Further, the information processing method provided by the embodiment of the present disclosure may also be executed by the server 605, and accordingly, the information processing apparatus may be provided in the server 605.
It should be understood that the number of terminal devices, networks and servers in fig. 6 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to fig. 7, a schematic diagram of an electronic device (e.g., a terminal device or server in fig. 6) suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 7 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 7, the electronic device may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 701, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 702 or a program loaded from a storage means 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data required for the operation of the electronic device 700 are also stored. The processing device 701, the ROM 702, and the RAM 703 are connected to each other through a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
In general, the following devices may be connected to the I/O interface 705: input devices 706 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, and the like; an output device 707 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 708 including, for example, magnetic tape, hard disk, etc.; and a communication device 709. The communication means 709 may allow the electronic device to communicate with other devices wirelessly or by wire to exchange data. While fig. 7 shows an electronic device having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via communication device 709, or installed from storage 708, or installed from ROM 702. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 701.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients, servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol ), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), the internet (e.g., the internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: in response to an information acquisition operation on a target three-dimensional entity, play a target video on a first interface and display a target entry on the first interface; and in response to a trigger operation on the target entry, display a second interface, where the second interface displays interactive resource information and an interactive control.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, an object oriented programming language such as Java, smalltalk, C ++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. The name of the unit does not in any way constitute a limitation of the unit itself, for example, the display unit may also be described as "a unit displaying the second interface".
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the technical principles employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of the features described above, but also covers other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, embodiments formed by replacing the above features with technical features having similar functions disclosed in (but not limited to) the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.
Claims (13)
1. An information processing method, characterized by comprising:
in response to an information acquisition operation for a target three-dimensional entity, playing a target video on a first interface, and displaying a target entry on the first interface;
and responding to the triggering operation aiming at the target entrance, displaying a second interface, wherein the second interface is displayed with interactive resource information and interactive control.
2. The method of claim 1, wherein the interactive control is a sharing control; and
the method further comprises:
in response to a trigger operation on the sharing control, sharing the interactive resources corresponding to the interactive resource information.
3. The method of claim 1, wherein the information acquisition operation comprises a scanning operation; and
the responding to the information acquisition operation aiming at the target three-dimensional entity plays the target video on the first interface and displays the target entrance on the first interface, and the method comprises the following steps:
and in response to the scanning operation for the target three-dimensional entity, playing the augmented reality scene image associated with the target three-dimensional entity on the first interface.
4. The method according to claim 1, wherein the method further comprises:
and responding to the triggering operation for the target entrance, and displaying a third interface, wherein a second interface is displayed on the upper layer of the third interface, and the second interface covers a partial area of the third interface.
5. The method according to claim 1, wherein the method further comprises:
in response to a first sliding operation on the second interface, controlling the second interface to completely cover the third interface;
and in response to a second sliding operation on the second interface, canceling display of the second interface.
6. The method according to claim 1, wherein,
in response to a closing operation on the second interface, a third interface is displayed, the third interface being used to display a media content stream associated with a target area.
7. The method of claim 1, wherein the displaying a second interface in response to the trigger operation on the target entry comprises:
determining whether the second interface is associated with a preset element, wherein the preset element is an element in a third interface;
and responding to the association of the second interface and the preset element, and displaying the second interface according to the position of the preset element in the third interface.
8. The method of claim 7, wherein the second interface is displayed outside of the preset elements.
9. The method of claim 7, wherein the second interface includes pointing information;
and when the third interface and the second interface are displayed, pointing information in the second interface points to preset elements in the third interface.
10. The method of claim 7, wherein the second interface is expanded and displayed gradually, the gradual expansion process comprising at least one of:
the second interface expanding starting from the preset element in the third interface;
and stopping the expansion when the distance between the second interface and the preset element in the third interface is smaller than a preset distance.
11. An information processing apparatus, characterized by comprising:
a first display unit for playing a target video at a first interface in response to an information acquisition operation for a target three-dimensional entity, and displaying a target entry at the first interface;
a second display unit for displaying a second interface in response to a trigger operation on the target entry, wherein the second interface displays interactive resource information and interactive controls.
12. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-10.
13. A computer readable medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method according to any one of claims 1-10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310964010.6A CN116931773A (en) | 2023-08-01 | 2023-08-01 | Information processing method and device and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310964010.6A CN116931773A (en) | 2023-08-01 | 2023-08-01 | Information processing method and device and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116931773A true CN116931773A (en) | 2023-10-24 |
Family
ID=88375295
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310964010.6A Pending CN116931773A (en) | 2023-08-01 | 2023-08-01 | Information processing method and device and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116931773A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |