CN111652675A - Display method and device and electronic equipment - Google Patents
- Publication number: CN111652675A (application CN202010422376.7A)
- Authority: CN (China)
- Prior art keywords: house, image, real, dimensional model, area
- Prior art date
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06Q30/0641 Shopping interfaces (electronic shopping)
- G06Q50/16 Real estate
- G06T15/005 General purpose rendering architectures
- G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
Abstract
Embodiments of the present disclosure disclose a display method, a display apparatus, and an electronic device. In one embodiment, the method includes: capturing and displaying real house images in real time; and displaying a first augmented image at a target area of the displayed real house image, where the first augmented image includes house introduction information corresponding to the real house image. A new display mode can thereby be provided.
Description
Technical Field
The present disclosure relates to the field of internet technologies, and in particular, to a display method and apparatus, and an electronic device.
Background
With the development of the internet, users increasingly use terminal devices to perform all kinds of functions. For example, a user can browse and search housing listings on a terminal device, and thus obtain listing information without leaving home. Alternatively, a user can use online listing information to screen out houses of interest, and then view them on site with a broker.
Disclosure of Invention
This disclosure is provided to introduce concepts in a simplified form that are further described below in the detailed description. This disclosure is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The embodiment of the disclosure provides a display method, a display device and electronic equipment.
In a first aspect, an embodiment of the present disclosure provides a display method, the method including: capturing and displaying real house images in real time; and displaying a first augmented image at a target area of the displayed real house image, where the first augmented image includes house introduction information corresponding to the real house image.
In a second aspect, an embodiment of the present disclosure provides a display device, including: the first display unit is used for acquiring and displaying real house images in real time; and the second display unit is used for displaying a first augmented image at the target area of the displayed real house image, wherein the first augmented image comprises house introduction information corresponding to the real house image.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the presentation method as described in the first aspect.
In a fourth aspect, the disclosed embodiments provide a computer readable medium, on which a computer program is stored, which when executed by a processor, implements the steps of the presentation method as described in the first aspect.
According to the display method, the display apparatus, and the electronic device provided by the embodiments of the present disclosure, a real house image is captured and displayed in real time; then, according to the determined target area of the real house image, a first augmented image corresponding to a target information three-dimensional model is displayed at that target area, where the first augmented image may include house introduction information. A new display mode can thereby be provided. Through this new display mode, an augmented image carrying house introduction information can be displayed on the real house image captured in real time, so that house introduction information is provided while the house is shown to the user. More house information is thus available for the user's judgment, which can improve the efficiency with which the user obtains house information and save the user's time.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
FIG. 1 is a flow chart of one embodiment of a presentation method according to the present disclosure;
FIG. 2 is a flow chart of an exemplary application scenario of a presentation method according to the present disclosure;
FIG. 3 is a flow chart of yet another exemplary application scenario of a presentation method according to the present disclosure;
FIG. 4 is a flow chart of yet another exemplary application scenario of a presentation method according to the present disclosure;
FIG. 5 is a schematic structural diagram of one embodiment of a display device according to the present disclosure;
FIG. 6 is an exemplary system architecture to which the presentation method of one embodiment of the present disclosure may be applied;
fig. 7 is a schematic diagram of a basic structure of an electronic device provided according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It should be noted that the modifiers "a", "an", and "the" used in this disclosure are illustrative rather than limiting; those skilled in the art will understand them to mean "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
Referring to fig. 1, a flow of one embodiment of a presentation method according to the present disclosure is shown. The presentation method shown in fig. 1 includes the following steps.
Step 101: capture a real house image in real time and display it.
In this embodiment, an execution subject of the presentation method (for example, a terminal device) may capture a real house image in real time and present the real house image.
In this embodiment, the execution subject may capture real-world images related to the house through a camera; then, a house three-dimensional model may be constructed from these images, either by the execution subject itself or by a server in communication connection with it.
In this embodiment, the execution subject may render the house three-dimensional model according to its own pose (position and orientation). As an example, a three-dimensional image rendering pipeline may be used to convert the house three-dimensional model into a two-dimensional image, and the converted two-dimensional image is then displayed.
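To make the pose-dependent rendering above concrete, the sketch below projects a single 3-D point into a 2-D image using a toy pinhole camera whose pose is a position plus a yaw angle. All parameters (`focal`, the principal point `cx`/`cy`) are illustrative assumptions, not details from this disclosure; a real system would use a full rendering pipeline.

```python
import math

def project_point(point, cam_pos, yaw, focal=500.0, cx=320.0, cy=240.0):
    """Project a 3-D world point into a 2-D image with a simple pinhole
    camera whose pose is a position plus a yaw-only rotation. Returns
    None if the point is behind the camera. A toy stand-in for a full
    three-dimensional image rendering pipeline."""
    # translate the point into camera-centred coordinates
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    # rotate about the vertical axis by -yaw
    c, s = math.cos(-yaw), math.sin(-yaw)
    xr = c * x + s * z
    zr = -s * x + c * z
    if zr <= 0:          # behind the camera: not visible
        return None
    # perspective division onto the image plane
    return (cx + focal * xr / zr, cy - focal * y / zr)
```

Rendering the whole model then amounts to applying this transform to every vertex and rasterizing the result, which is what the rendering pipeline mentioned above does for each frame as the pose changes.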
Step 102: display a first augmented image at the target area of the displayed real house image.
In this embodiment, the execution subject may present the first augmented image at the target area of the displayed real house image.
Here, the first augmented image may include house introduction information corresponding to a real house image.
Here, the house introduction information may be used to introduce a house. The specific content included in the house introduction information may be preset according to an actual application scenario, and is not limited herein.
As an example, the house introduction information may include at least one of, but is not limited to: house area, house orientation, house layout (three-room-one-hall, two-room-one-hall, etc.).
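As a purely illustrative sketch of how the fields above might be carried together (the type and field names are assumptions of this sketch, not structures defined by this disclosure):

```python
from dataclasses import dataclass

@dataclass
class HouseIntro:
    """Hypothetical container for house introduction information."""
    area_sqm: float      # house area in square metres
    orientation: str     # house orientation, e.g. "south-facing"
    layout: str          # house layout, e.g. "three rooms, one hall"

    def label(self) -> str:
        """Render the introduction as a short display string."""
        return f"{self.layout}, {self.area_sqm:.0f} sq m, {self.orientation}"

intro = HouseIntro(area_sqm=100, orientation="south-facing",
                   layout="three rooms, one hall")
```

A record like this could be stored per house (or per room, matching the granularity discussed below) and rendered into the first augmented image.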
Optionally, the granularity at which the house introduction information describes the house is not limited herein. As an example, the house introduction information may describe the entire house (e.g., a three-room, one-hall apartment), or a particular room within it (e.g., the living room or the master bedroom of that apartment).
In this embodiment, in response to detecting an indicating operation that indicates the position of the target area, the execution subject may display the first augmented image superimposed on the real house image displayed in step 101.
Here, the position of the overlay display may be a position where the target area is located.
Here, the house introduction information may be stored in correspondence with the house, and an information three-dimensional model may be created and stored in advance. After the target house is determined, combining the house introduction information of the target house with the information three-dimensional model yields the target information three-dimensional model.
Here, the first enhanced image is an image corresponding to the target information three-dimensional model.
Here, the above-mentioned displaying of the first enhanced image may be implemented in various ways, and is not limited herein.
As an example, the position of the target area may be used as an anchor point, and the first augmented image may be superimposed at that anchor point on subsequently displayed real house images.
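The anchored overlay just described can be sketched as plain 2-D compositing: copy the background frame and write the augmented image's pixels starting at the anchor, clipping at the frame edges. The grid-of-lists pixel representation is an illustrative simplification of real image buffers, not an API from this disclosure.

```python
def overlay_at_anchor(frame, overlay, anchor):
    """Superimpose `overlay` (a small 2-D grid of pixels) onto `frame`
    at the top-left `anchor` = (row, col), clipping at the frame edges.
    The input frame is left unmodified; a new frame is returned."""
    r0, c0 = anchor
    out = [row[:] for row in frame]        # copy the background frame
    for dr, orow in enumerate(overlay):
        for dc, pix in enumerate(orow):
            r, c = r0 + dr, c0 + dc
            if 0 <= r < len(out) and 0 <= c < len(out[0]):
                out[r][c] = pix            # overlay wins over background
    return out
```

Running this per frame with a fixed anchor is the 2-D analogue of pinning the first augmented image to the target area on each newly captured real house image.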
It should be noted that, in the display method provided by this embodiment, the real house image is captured and displayed in real time; then, according to the determined target area of the real house image, the first augmented image corresponding to the target information three-dimensional model is displayed at that target area, where the first augmented image may include house introduction information. A new display mode is thereby provided: an augmented image carrying house introduction information is displayed on the real house image captured in real time, so that house introduction information is provided while the house is shown to the user. More house information is thus available for the user's judgment, which can improve the efficiency with which the user obtains house information and save the user's time.
Referring to fig. 2, an exemplary application scenario of the embodiment corresponding to fig. 1 is shown. Specifically, a user can use a terminal to capture real house images in real time. As shown in fig. 2, a captured real house image may include a window, a sofa, and so on. The terminal may then determine the wall surface above the sofa as the target area, and display the house introduction information "three rooms, one living room, 100 square meters" corresponding to the real house image as the first augmented image in that target area.
In some embodiments, the method further comprises: in response to detecting a delete operation for the first enhanced image, ceasing to present the first enhanced image.
Here, the implementation form of the deletion operation may be set according to the actual application scenario, and is not limited herein.
As an example, the deletion operation may include a click operation on a deletion control. The delete control may be displayed in association with the first enhanced image, for example to the upper right of the first enhanced image.
As an example, the deletion operation may include a swipe operation on the first augmented image: if the user swipes the first augmented image, the first augmented image is deleted.
It should be noted that stopping the presentation of the first augmented image in response to a deletion operation provides a way to dismiss it. When the user no longer wants the first augmented image displayed (for example, the house introduction information has been read, or the user wants to see the real house image without the overlay), the image the user desires can be presented on demand, improving the user's information acquisition efficiency.
In some embodiments, the target area may be specified by a user, or may be determined by the execution subject performing image recognition on a house image.
Optionally, the execution subject may perform the image recognition on a real house image captured in real time, or on a pre-established house three-dimensional model.
In some embodiments, the target area may be determined by the determining step. It should be noted that the determining step may be executed by the terminal, may also be executed by the server, and may also be executed by a combination of the terminal and the server.
In some embodiments, the determining step may include: performing image recognition on the captured real house image, and determining a blank area obtained by the image recognition as the target area.
Taking fig. 2 as an example, the glass area of the window may be determined as the target area, or the wall area above the sofa may be determined as the target area.
It should be noted that determining a blank area as the target area makes reasonable use of empty regions in the image and improves the information display efficiency of the screen. Moreover, a large blank area can make the real house image look monotonous and degrade the browsing experience in the augmented display mode; displaying the house introduction information in such a target area puts the blank area to use and improves the user's browsing experience.
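One minimal way to realise "determine a blank area by image recognition", assuming a blank region is simply a low-variance patch of a grayscale image, is a sliding-window variance scan. Both the window size and the variance threshold below are illustrative choices of this sketch:

```python
def find_blank_block(gray, win=2, var_thresh=4.0):
    """Scan a grayscale image (list of lists of intensities) with a
    win x win window and return the top-left corner (row, col) of the
    first window whose intensity variance falls below `var_thresh` --
    a crude proxy for a 'blank' wall or glass region."""
    h, w = len(gray), len(gray[0])
    for r in range(h - win + 1):
        for c in range(w - win + 1):
            vals = [gray[r + i][c + j] for i in range(win) for j in range(win)]
            mean = sum(vals) / len(vals)
            var = sum((v - mean) ** 2 for v in vals) / len(vals)
            if var < var_thresh:          # little texture: treat as blank
                return (r, c)
    return None                            # no blank region found
```

A production system would more likely use plane detection or semantic segmentation, but the low-variance heuristic captures the intuition that walls and glass carry little texture compared with furniture.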
In some embodiments, in response to a selection operation performed by the user on the interface displaying the real house image, the blank area determined by that selection operation may be used as the target area.
In some application scenarios, the determining step may include: presenting region selection indication information on the interface displaying the real house image, where the region selection indication information prompts the user to select a region for presenting the house introduction information.
It will be appreciated that the user action of region selection may indicate a point or a region.
For example, the user may click a position on the screen, i.e., the action indicates a point; in this case, the execution subject may determine the target area from the point indicated by the user's action and the boundaries identified in the image.
Referring to fig. 3, the area selection indication information 301 may include "please click on an area where you wish to show house introduction information". The user can click on the area where the house introduction information is expected to be displayed under the guidance of the indication information.
For another example, the user may draw a circle on the screen, i.e., the action indicates an area; in this case, the execution subject may determine the target area from the area circled by the user.
Referring to fig. 4, the area selection indication information 401 may include "please circle an area where you want to show house introduction information". Guided by the indication information, the user may circle the area where the house introduction is to be presented, producing the circling trajectory 402 shown in fig. 4.
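For the circling gesture above, one simple way (an assumption of this sketch, not specified by this disclosure) to turn the drawn trajectory into a target area is to take the axis-aligned bounding box of the trajectory points:

```python
def region_from_trajectory(points):
    """Turn a user-drawn circling trajectory (a list of (x, y) screen
    points) into an axis-aligned target region given as
    (x_min, y_min, x_max, y_max)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))
```

The bounding box is deliberately forgiving: however roughly the user circles, the resulting rectangle covers everything the trajectory enclosed.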
It should be noted that determining the target area from the user's selection allows the house introduction information to be displayed where the user wants it. On the one hand, the area a user indicates is usually one that does not obstruct the view of the house, so a user-specified target area adapts to the user's individual situation and improves information display efficiency. On the other hand, compared with finding a blank area by image recognition on images captured in real time, letting the user specify the target area avoids the heavy computation of image recognition, speeds up target-area determination, and reduces the consumption of computing resources.
In some embodiments, the determining step may include: in response to determining that the display area of the target area in the real house image captured in real time is smaller than a preset area threshold, re-determining the target area.
Here, after the target area is determined, it is a fixed physical region, such as the north wall of the living room. Because the real house image captured in real time depends on the pose of the terminal, the captured image may contain only part of the target area, and as the camera moves, the visible part may gradually shrink or disappear entirely. Therefore, although the physical area of the target region is constant, its display area in the real-time captured real house image may vary.
Here, the area threshold may be determined according to actual conditions, and is not limited herein.
Here, the target area may be re-determined in various ways, for example, a blank area may be determined by using image recognition, and the re-determined blank area may be used as the target area; the user may also be asked to re-select the target area to re-determine the target area.
It should be noted that, when the display area of the target area is smaller than the preset area threshold, the target area is re-determined. This avoids the situation where the display area is too small to show the house introduction information, helps ensure that the house introduction information remains visible, and improves the efficiency with which the user obtains information.
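The re-determination trigger described above reduces to an area comparison. The sketch below assumes the visible part of the target area has already been projected to a screen-space rectangle; both the rectangle representation and the threshold value are illustrative assumptions:

```python
def needs_redetermination(visible_region, area_threshold):
    """Decide whether the target area must be re-determined: compute
    the on-screen area of the visible part of the target region, given
    as (x_min, y_min, x_max, y_max), and compare it with the preset
    threshold. Returns True when the region is too small or gone."""
    if visible_region is None:            # target has left the view entirely
        return True
    x0, y0, x1, y1 = visible_region
    area = max(0, x1 - x0) * max(0, y1 - y0)
    return area < area_threshold
```

When this returns True, the target area would be re-determined by one of the mechanisms above (a fresh blank-area search, or asking the user to select again).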
In some embodiments, the step 102 may include: obtaining a first video obtained by rendering the derived three-dimensional model; and displaying the first video.
Here, the derived three-dimensional model is obtained by combining the target information three-dimensional model and a house three-dimensional model, and the house three-dimensional model corresponds to the real house image.
It can be understood that the first video may be obtained by rendering the derived three-dimensional model according to the current pose of the execution subject.
Here, the three-dimensional house model may be constructed in real time based on real house images acquired in real time, may be pre-established, or may be obtained by modifying an initial model based on real house images acquired in real time.
In some application scenarios, the three-dimensional house model constructed by the real house image can be used to determine the three-dimensional house model corresponding to the real house image.
Here, the derived three-dimensional model includes a target information three-dimensional model and the house three-dimensional model, and the first video may include the real house image and the first augmented image.
Here, the execution subject may render the derived three-dimensional model according to its own pose (position and orientation). As an example, a three-dimensional image rendering pipeline may be used to convert the derived three-dimensional model into two-dimensional images, which are then displayed. It can be understood that the converted two-dimensional images, arranged in time order, form the first video.
It should be noted that, because the derived three-dimensional model is obtained by combining the target information three-dimensional model with the house three-dimensional model, the two models are fused, so that the relative position between them remains unchanged as the pose of the execution subject changes. This reduces the artificial appearance of the image rendered from the target information three-dimensional model and enhances the sense that the displayed house introduction information is really present in the house.
In some embodiments, the derived three-dimensional model may be generated by the following generating steps, which may include: acquiring a house three-dimensional model corresponding to the real house image; determining a combination position in the house three-dimensional model according to the position of the target area, wherein the combination position is the combination position of the target information three-dimensional model and the house three-dimensional model; and combining the target information three-dimensional model with the house three-dimensional model at the combination position to obtain a derivative three-dimensional model.
Here, the electronic device that executes the generation step may be the execution subject or may be another electronic device other than the execution subject.
It can be understood that the house three-dimensional model has a mapping relationship with three-dimensional physical space: a spatial location has a corresponding mapped point in the house three-dimensional model, and this point can be understood as the position at which the target information three-dimensional model is combined with the house three-dimensional model.
Here, determining the binding location according to the target region may include: the center position of the target region is determined as a binding position.
Here, the above-mentioned target information three-dimensional model may be added to the house three-dimensional model at the above-mentioned joining position to obtain a derivative three-dimensional model.
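A minimal sketch of the generation step, assuming the target area maps to an axis-aligned box in house-model coordinates and representing models as plain dicts (both are assumptions of this sketch, not structures from this disclosure): the combination position is taken as the box centre, per the option above, and the information model is attached there.

```python
def binding_position(region3d):
    """Take the target region as an axis-aligned box in house-model
    coordinates ((x0, y0, z0), (x1, y1, z1)) and return its centre,
    used as the position at which the target information model is
    combined with the house model."""
    (x0, y0, z0), (x1, y1, z1) = region3d
    return ((x0 + x1) / 2, (y0 + y1) / 2, (z0 + z1) / 2)

def combine_models(house_model, info_model, region3d):
    """Attach the information model to the house model at the binding
    position, yielding a 'derived' model. The input house model is
    left unmodified; a new dict is returned."""
    derived = dict(house_model)
    derived["attachments"] = house_model.get("attachments", []) + [
        {"model": info_model, "position": binding_position(region3d)}
    ]
    return derived
```

Re-determining the target area then simply means calling `combine_models` again with the new region, which is the update described in the following embodiments.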
It should be noted that, by determining a combination position according to the target area and then combining the target information three-dimensional model with the house three-dimensional model at that position to obtain the derived three-dimensional model, an accurate derived model can be obtained according to the position indicated by the user.
In some embodiments, the generating step may include: in response to determining to re-determine the target region, the derived three-dimensional model is updated based on the re-determined target region.
Here, after the target area is re-determined in the determining step, the execution body of the generating step may use the re-determined target area to determine the combination position in the house three-dimensional model again, so as to update the derived three-dimensional model.
It should be noted that, by updating the derived three-dimensional model based on the re-determined target area, the derived model can be refreshed in time with the new target area after re-determination, which helps ensure that the house introduction information remains visible and improves the efficiency with which the user obtains information.
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of a display apparatus, which corresponds to the embodiment of the method shown in fig. 1, and which may be applied in various electronic devices.
As shown in fig. 5, the display device of the present embodiment includes: a first display unit 501 and a second display unit 502; the first display unit is used for acquiring and displaying real house images in real time; and the second display unit is used for displaying a first augmented image at the target area of the displayed real house image, wherein the first augmented image comprises house introduction information corresponding to the real house image.
In this embodiment, specific processing of the first display unit 501 and the second display unit 502 of the display apparatus and technical effects thereof can refer to related descriptions of step 101 and step 102 in the corresponding embodiment of fig. 1, which are not repeated herein.
In some embodiments, the target area is determined by a determining step, where the determining step includes: performing image recognition on the captured real house image, and determining a blank area obtained by the image recognition as the target area.
In some embodiments, the target region is determined by a determining step, wherein the determining step comprises: displaying area selection indication information, wherein the area selection indication information is used for indicating a user to select an area for displaying house introduction information; and determining the area selected by the user as a target area.
In some embodiments, the determining step includes: in response to determining that the display area of the target area in the real house image captured in real time is smaller than a preset area threshold, re-determining the target area.
In some embodiments, said presenting, at the target area of the presented real-house image, a first augmented image comprises: acquiring a first video obtained by rendering a derivative three-dimensional model, wherein the derivative three-dimensional model is obtained by combining the target information three-dimensional model and a house three-dimensional model corresponding to the real house image; and displaying the first video.
In some embodiments, the derived three-dimensional model is generated by a generating step comprising: acquiring the house three-dimensional model corresponding to the presented real house image; determining a combination position in the house three-dimensional model according to the target area, the combination position being where the house three-dimensional model and the target information three-dimensional model are joined; and combining the house three-dimensional model with the target information three-dimensional model at the combination position to obtain the derived three-dimensional model.
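A toy version of the combining step, assuming both models are plain lists of (x, y, z) vertex tuples and the combination position reduces to a single translation anchor (real models would also carry faces, materials, and orientation):

```python
def combine_models(house_vertices, info_vertices, anchor):
    """Translate the target-information model so its local origin sits
    at `anchor` (the combination position in the house model's
    coordinate frame) and merge both vertex lists into one derived
    model. Faces and materials are ignored in this sketch."""
    ax, ay, az = anchor
    placed = [(x + ax, y + ay, z + az) for x, y, z in info_vertices]
    return list(house_vertices) + placed

house = [(0.0, 0.0, 0.0), (4.0, 0.0, 0.0), (4.0, 3.0, 0.0)]
panel = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]  # the info "panel" model
derived = combine_models(house, panel, anchor=(2.0, 1.0, 0.5))
print(derived[-1])  # → (3.0, 1.0, 0.5)
```

Updating the derived model after the target area is re-determined, as in the next paragraph, would amount to calling `combine_models` again with a new anchor.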
In some embodiments, the generating step comprises: in response to re-determining the target region, updating the derived three-dimensional model based on the re-determined target region.
In some embodiments, the apparatus further comprises a deletion unit for stopping presentation of the first augmented image in response to detecting a deletion operation for the first augmented image.
Referring to fig. 6, fig. 6 illustrates an exemplary system architecture to which the presentation method of one embodiment of the present disclosure may be applied.
As shown in fig. 6, the system architecture may include terminal devices 601, 602, 603, a network 604, and a server 606. The network 604 serves as the medium for communication links between the terminal devices 601, 602, 603 and the server 606, and may include various types of connections, such as wired or wireless communication links, or fiber-optic cables.
The terminal devices 601, 602, 603 may interact with the server 606 over the network 604 to receive or send messages and the like. Various client applications may be installed on the terminal devices 601, 602, 603, such as web browser applications, search applications, and news and information applications. The client applications in the terminal devices 601, 602, 603 may receive instructions from the user and complete the corresponding functions accordingly, for example, adding corresponding information to displayed information according to the user's instruction.
The terminal devices 601, 602, 603 may be hardware or software. When they are hardware, they may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III), MP4 players (Moving Picture Experts Group Audio Layer IV), laptop computers, desktop computers, and the like. When the terminal devices 601, 602, 603 are software, they may be installed in the electronic devices listed above and implemented as multiple pieces of software or software modules (e.g., software or software modules for providing distributed services) or as a single piece of software or software module. This is not specifically limited herein.
The server 606 may be a server providing various services, for example, one that receives an information acquisition request sent by the terminal devices 601, 602, 603, acquires, in various manners, presentation information corresponding to the request, and sends the relevant data of the presentation information back to the terminal devices 601, 602, 603.
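Sketched as a plain function (the house ids, field names, and response shape below are invented for illustration; a real server would also handle transport, authentication, and so on), the request-to-response path might look like:

```python
# A toy in-memory store standing in for the server's data source.
PRESENTATION_INFO = {
    "house-001": {"price": "2.1M", "area_m2": 89, "layout": "3BR/2BA"},
}

def handle_info_request(request):
    """Look up presentation info for the requested house and build the
    response the server would send back to a terminal device."""
    house_id = request.get("house_id")
    info = PRESENTATION_INFO.get(house_id)
    if info is None:
        return {"status": 404, "body": None}
    return {"status": 200, "body": info}

print(handle_info_request({"house_id": "house-001"})["status"])  # → 200
```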
It should be noted that the presentation method provided by the embodiments of the present disclosure may be executed by a terminal device, in which case the presentation apparatus may be disposed in the terminal devices 601, 602, 603. Alternatively, the presentation method may be executed by the server 606, in which case the presentation apparatus may be disposed in the server 606.
It should be understood that the number of terminal devices, networks, and servers in fig. 6 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to fig. 7, shown is a schematic diagram of an electronic device (e.g., a terminal device or the server of fig. 6) suitable for implementing embodiments of the present disclosure. The terminal device in the embodiments of the present disclosure may include, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and vehicle-mounted terminals (e.g., car navigation terminals), as well as stationary terminals such as digital TVs and desktop computers. The electronic device shown in fig. 7 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in fig. 7, the electronic device may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 701, which may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 702 or a program loaded from a storage device 708 into a random access memory (RAM) 703. The RAM 703 also stores various programs and data necessary for the operation of the electronic device. The processing device 701, the ROM 702, and the RAM 703 are connected to one another via a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
Generally, the following devices may be connected to the I/O interface 705: input devices 706 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 707 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 708 including, for example, magnetic tape, hard disk, etc.; and a communication device 709. The communication device 709 may allow the electronic device to communicate wirelessly or by wire with other devices to exchange data. While fig. 7 illustrates an electronic device having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such embodiments, the computer program may be downloaded and installed from a network via the communication means 709, or may be installed from the storage means 708, or may be installed from the ROM 702. The computer program, when executed by the processing device 701, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (Hypertext Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire and display real house images in real time; and display a first augmented image at a target area of the displayed real house image, wherein the first augmented image comprises house introduction information corresponding to the real house image.
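Read as ordinary code, the two operations the stored program performs could be sketched as a capture-and-augment loop; all four callables below are hypothetical stand-ins for the device's camera, recognition, rendering, and display facilities:

```python
def presentation_loop(capture_frame, determine_target, make_overlay, show, frames=3):
    """Run the two operations described above as a loop: acquire and
    show a real house frame in real time, then build the augmented
    image (house introduction info) at the frame's target area."""
    shown = []
    for _ in range(frames):
        frame = capture_frame()            # step 1: acquire in real time
        show(frame)                        # ...and display the frame
        target = determine_target(frame)   # locate the target area
        shown.append(make_overlay(frame, target))  # step 2: augment
    return shown

out = presentation_loop(
    capture_frame=lambda: "frame",
    determine_target=lambda f: (0, 0, 10, 10),
    make_overlay=lambda f, t: ("intro-info", t),
    show=lambda f: None,
)
print(len(out))  # → 3
```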
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a unit does not, in some cases, constitute a limitation on the unit itself; for example, the first display unit may also be described as "a unit that displays a real house image".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is merely a description of preferred embodiments of the present disclosure and of the technical principles employed. Those skilled in the art will appreciate that the scope of the disclosure is not limited to technical solutions formed by the particular combination of features described above, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, technical solutions formed by substituting the above features with (but not limited to) features having similar functions disclosed in the present disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Claims (11)
1. A method of displaying, comprising:
acquiring and displaying real house images in real time; and
displaying a first augmented image at a target area of the displayed real house image, wherein the first augmented image comprises house introduction information corresponding to the real house image.
2. The method of claim 1, wherein the target area is determined by a determining step comprising:
performing image recognition on the acquired real house image, and determining a blank area obtained by the image recognition as the target area.
3. The method of claim 1, wherein the target area is determined by a determining step comprising:
displaying area selection indication information, wherein the area selection indication information is used to prompt a user to select an area for displaying the house introduction information; and
determining the area selected by the user as the target area.
4. The method according to claim 2 or 3, wherein the determining step comprises:
re-determining the target area in response to determining that the display area of the target area in the real house image acquired in real time is smaller than a preset area threshold.
5. The method of claim 1, wherein the displaying a first augmented image at the target area of the displayed real house image comprises:
acquiring a first video obtained by rendering a derived three-dimensional model, wherein the derived three-dimensional model is obtained by combining a target information three-dimensional model with a house three-dimensional model corresponding to the real house image; and
displaying the first video.
6. The method of claim 5, wherein the derived three-dimensional model is generated based on a generating step comprising:
acquiring a house three-dimensional model corresponding to the displayed real house image;
determining a combination position in the house three-dimensional model according to the target area, the combination position being where the house three-dimensional model and the target information three-dimensional model are joined; and
combining the house three-dimensional model with the target information three-dimensional model at the combination position to obtain the derived three-dimensional model.
7. The method of claim 6, wherein the generating step comprises:
in response to re-determining the target region, updating the derived three-dimensional model based on the re-determined target region.
8. The method of claim 1, further comprising:
in response to detecting a deletion operation for the first augmented image, ceasing to display the first augmented image.
9. A display device, comprising:
the first display unit is used for acquiring and displaying real house images in real time;
and the second display unit is used for displaying a first augmented image at the target area of the displayed real house image, wherein the first augmented image comprises house introduction information corresponding to the real house image.
10. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-8.
11. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010422376.7A CN111652675A (en) | 2020-05-18 | 2020-05-18 | Display method and device and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111652675A true CN111652675A (en) | 2020-09-11 |
Family
ID=72348298
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010422376.7A Pending CN111652675A (en) | 2020-05-18 | 2020-05-18 | Display method and device and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111652675A (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106530404A (en) * | 2016-11-09 | 2017-03-22 | 大连文森特软件科技有限公司 | Inspection system of house for sale based on AR virtual reality technology and cloud storage |
CN106779900A (en) * | 2016-11-09 | 2017-05-31 | 大连文森特软件科技有限公司 | House for sale based on AR virtual reality technologies investigates system |
CN107993289A (en) * | 2017-12-06 | 2018-05-04 | 重庆欧派信息科技有限责任公司 | Finished system based on AR augmented realities |
CN108572969A (en) * | 2017-03-09 | 2018-09-25 | 阿里巴巴集团控股有限公司 | The method and device of geography information point recommended information is provided |
CN108958571A (en) * | 2017-05-24 | 2018-12-07 | 腾讯科技(深圳)有限公司 | Three-dimensional session data methods of exhibiting, device, storage medium and computer equipment |
CN110781263A (en) * | 2019-10-25 | 2020-02-11 | 北京无限光场科技有限公司 | House resource information display method and device, electronic equipment and computer storage medium |
CN111145352A (en) * | 2019-12-20 | 2020-05-12 | 北京乐新创展科技有限公司 | House live-action picture display method and device, terminal equipment and storage medium |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112749245A (en) * | 2021-01-07 | 2021-05-04 | 北京码牛科技有限公司 | One-standard three-real information management method and device and electronic equipment |
CN114049167A (en) * | 2021-10-13 | 2022-02-15 | 阿里巴巴(中国)有限公司 | Method and device for displaying commodity object information and electronic equipment |
WO2023098915A1 (en) * | 2021-12-03 | 2023-06-08 | 如你所视(北京)科技有限公司 | Method and apparatus for presenting content of three-dimensional house model |
CN114638951A (en) * | 2022-03-29 | 2022-06-17 | 北京有竹居网络技术有限公司 | House model display method and device, electronic equipment and readable storage medium |
CN114638951B (en) * | 2022-03-29 | 2023-08-15 | 北京有竹居网络技术有限公司 | House model display method and device, electronic equipment and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right |

Effective date of registration: 20230506 Address after: Room 802, Information Building, 13 Linyin North Street, Pinggu District, Beijing, 101299 Applicant after: Beijing youzhuju Network Technology Co.,Ltd. Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing. Applicant before: BEIJING BYTEDANCE NETWORK TECHNOLOGY Co.,Ltd. |