
CN108235764B - Information processing method and device, cloud processing equipment and computer program product - Google Patents

Information processing method and device, cloud processing equipment and computer program product

Info

Publication number
CN108235764B
CN108235764B (application CN201780002896.9A)
Authority
CN
China
Prior art keywords
building
image information
information
model
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201780002896.9A
Other languages
Chinese (zh)
Other versions
CN108235764A (en)
Inventor
王恺
廉士国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cloudminds Robotics Co Ltd
Original Assignee
Cloudminds Shanghai Robotics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cloudminds Shanghai Robotics Co Ltd filed Critical Cloudminds Shanghai Robotics Co Ltd
Publication of CN108235764A publication Critical patent/CN108235764A/en
Application granted granted Critical
Publication of CN108235764B publication Critical patent/CN108235764B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/28Databases characterised by their database models, e.g. relational or object models
    • G06F16/284Relational databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Library & Information Science (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The embodiment of the invention provides an information processing method and device, cloud processing equipment and a computer program product, relates to the technical field of data processing, and improves the accuracy and speed of recognizing static objects to a certain extent. The information processing method provided by the embodiment of the invention comprises the following steps: acquiring image information shot by a terminal; acquiring position information of the terminal; retrieving an AR model according to the position information and the image information; and sending the AR model to the terminal.

Description

Information processing method and device, cloud processing equipment and computer program product
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to an information processing method and apparatus, a cloud processing device, and a computer program product.
Background
AR technology, also known as Augmented Reality, simulates entity information (visual content, sound, taste, touch, and the like) that is difficult to experience within a given range of time and space in the real world and superimposes it onto the real world, so that the entity information can be perceived by the human senses, thereby achieving a sensory experience that goes beyond reality.
Moreover, AR technology can also enhance the interaction between static objects and the user by presenting different overlay scenes on those objects. In the prior art, however, the image taken by the user may show only part of a static object, so the whole object has to be identified from a partial view, which results in a large search amount and, in turn, low accuracy in identifying the static object.
Disclosure of Invention
The embodiment of the invention provides an information processing method, an information processing device, cloud processing equipment and a computer program product, which improve the accuracy and speed of recognizing static objects and enable AR models to be superimposed more accurately.
In a first aspect, an embodiment of the present invention provides an information processing method, including:
acquiring image information shot by a terminal;
acquiring the position information of the terminal;
retrieving an AR model according to the position information and the image information;
and sending the AR model to the terminal.
With reference to the above aspect and any possible implementation thereof, a further implementation is provided in which the retrieving an AR model according to the position information and the image information includes:
determining a retrieval range according to the position information;
retrieving all building information within the retrieval range;
acquiring imagery information corresponding to each building in the building information;
determining a building contained in the image information according to the imagery information;
an AR model corresponding to the building is retrieved.
With reference to the above aspect and any possible implementation thereof, a further implementation is provided in which the acquiring imagery information corresponding to each building in the building information includes:
acquiring 3D model information corresponding to each building in the building information; or,
if the 3D model information corresponding to the building information is not acquired, acquiring 2D picture information corresponding to each building in the building information.
With reference to the above aspect and any possible implementation thereof, a further implementation is provided in which the method further includes:
establishing at least one relational database according to the building information, the imagery information and the AR model.
In a second aspect, an embodiment of the present invention further provides an information processing apparatus, including:
the first acquisition unit is used for acquiring image information shot by the terminal;
a second obtaining unit, configured to obtain location information of the terminal;
the retrieval unit is used for retrieving an AR model according to the position information and the image information;
and the sending unit is used for sending the AR model to the terminal.
With reference to the above aspect and any possible implementation thereof, a further implementation is provided in which the retrieval unit is specifically configured to:
determining a retrieval range according to the position information;
retrieving building information within the retrieval range;
acquiring imagery information corresponding to each building in the building information;
determining a building contained in the image information according to the imagery information;
an AR model corresponding to the building is retrieved.
With reference to the above aspect and any possible implementation thereof, a further implementation is provided in which the retrieval unit is specifically configured to:
acquiring 3D model information corresponding to each building in the building information; or,
if the 3D model information corresponding to the building information is not acquired, acquiring 2D picture information corresponding to each building in the building information.
With reference to the above aspect and any possible implementation thereof, a further implementation is provided in which the apparatus further includes:
an establishing unit, configured to establish at least one relational database according to the building information, the imagery information and the AR model.
In a third aspect, an embodiment of the present invention further provides a cloud processing device, where the device includes a processor and a memory; the memory is for storing instructions that, when executed by the processor, cause the apparatus to perform the method of any of the first aspects.
In a fourth aspect, embodiments of the present invention further provide a computer program product, which is directly loadable into an internal memory of a computer and contains software code, and which, when loaded and executed by the computer, is able to implement the method according to any one of the first aspect.
With the information processing method and device, cloud processing equipment and computer program product provided by the embodiments of the invention, the image information shot by the terminal and the position information of the terminal are obtained, and the corresponding AR model is then retrieved by combining the position information and the image information. Retrieving the AR model according to the position information narrows the search range and improves retrieval precision, while determining the corresponding AR model in combination with the image information shot by the terminal improves retrieval efficiency and the recognition rate of static objects, thereby solving the prior-art problems of a large search amount and low accuracy in identifying static objects.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a flowchart of an embodiment of an information processing method according to the present invention;
FIG. 2 is another flowchart of an embodiment of an information processing method according to the present invention;
FIG. 3 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of another structure of an embodiment of an information processing apparatus according to the present invention;
FIG. 5 is a schematic structural diagram of an embodiment of a cloud processing device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" as used herein merely describes an association between associated objects, meaning that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
The word "if" as used herein may be interpreted as "at … …" or "when … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
When using a terminal or a wearable device, a user often shoots videos or images with the camera; in an outdoor environment, static objects such as buildings frequently appear in the camera view, or the user actively shoots such static objects. Various information, such as service information, advertisements and publicity material, can be attached to the surface of an outdoor building, but replacing that content is time-consuming, labor-intensive and costly. By taking advantage of these usage habits of users with terminals or wearable devices, AR technology can be integrated with outdoor buildings, bringing benefits such as customization, low cost and a strong sense of immersion.
However, images of an outdoor building are acquired under changing illumination, and when the building is large the acquired image may show only part of it, both of which reduce the accuracy of building identification to some extent. To solve the foregoing problem, an embodiment of the present invention provides an information processing method. Specifically, fig. 1 is a flowchart of an embodiment of the information processing method according to an embodiment of the present invention, and as shown in fig. 1, the information processing method according to the embodiment of the present invention may specifically include the following steps:
101. Acquire the image information shot by the terminal.
In the embodiment of the present invention, the terminal may include intelligent devices with a camera function, such as a mobile phone, a tablet computer, a notebook computer, vehicle-mounted equipment, a wearable device and the like. In addition, in the embodiment of the present invention, the terminal also needs to have a network communication function.
The image information shot by the terminal is obtained by shooting the outdoor environment with a device having an image acquisition function, such as a camera controlled by the user. The image information may be acquired by a computing device with high computing power, which includes at least a processing unit and a wireless transmission unit, for example a local computer or a cloud processing center. The terminal and the computing device can communicate with each other using wireless communication methods such as 2G, 3G, 4G or WiFi.
In the embodiment of the invention, the terminal may actively upload the shot image information to the computing device, which receives it, or the computing device may actively acquire the image information shot by the terminal.
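By way of illustration only, the following is a minimal sketch of the cloud side of step 101, assuming the terminal posts a JPEG frame over HTTP; the route, field names and use of Flask are illustrative assumptions and are not prescribed by the embodiment described above.

```python
# A minimal sketch of a cloud-side upload endpoint for step 101.
import io

from flask import Flask, request, jsonify
from PIL import Image

app = Flask(__name__)

@app.route("/upload_image", methods=["POST"])
def upload_image():
    terminal_id = request.form.get("terminal_id", "unknown")
    raw = request.files["image"].read()               # image information shot by the terminal
    frame = Image.open(io.BytesIO(raw)).convert("RGB")
    # Hand the decoded frame to the retrieval pipeline (steps 102-104).
    print(f"received {frame.size} frame from terminal {terminal_id}")
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```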
102. Acquire the position information of the terminal.
In the embodiment of the present invention, the position of the terminal may be determined by locating the terminal. Specifically, in one implementation, a Global Positioning System (GPS) may be used to determine the longitude and latitude of the terminal, and the terminal is located by combining them with map information. In another implementation, the position of the terminal may be determined from base station information: the terminal reports the identifier of the base station to which it currently belongs, together with the signal strength, to the locating device, which then determines the coordinates of the terminal's current position.
In the embodiment of the invention, the terminal may actively upload its position information to the computing device, which receives it, or the computing device may actively obtain information from the terminal and locate it to derive the position information. Acquiring the position information of the terminal allows its location to be determined more accurately, which greatly reduces the difficulty of the subsequent image information identification and lays a foundation for improving the accuracy of that identification.
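As an illustration of step 102, the sketch below resolves the terminal's position, preferring GPS coordinates and falling back to a base-station lookup; the dataclass fields and the cell-location table are assumptions made for the example.

```python
# A minimal sketch of resolving the terminal position: GPS first, cell lookup second.
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical lookup table: base-station identifier -> (latitude, longitude).
CELL_LOCATIONS = {
    "LAC1234-CID5678": (31.0123, 121.3456),
}

@dataclass
class PositionReport:
    latitude: Optional[float] = None          # from GPS, if available
    longitude: Optional[float] = None
    cell_id: Optional[str] = None             # serving base-station identifier
    signal_strength_dbm: Optional[float] = None

def resolve_position(report: PositionReport) -> Tuple[float, float]:
    """Return (lat, lon) of the terminal, preferring GPS over the cell lookup."""
    if report.latitude is not None and report.longitude is not None:
        return report.latitude, report.longitude
    if report.cell_id in CELL_LOCATIONS:
        return CELL_LOCATIONS[report.cell_id]
    raise ValueError("terminal position could not be determined")

print(resolve_position(PositionReport(cell_id="LAC1234-CID5678")))
```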
103. Retrieve the AR model according to the position information and the image information.
In the embodiment of the present invention, since the position of the terminal may change and the obtained position information may contain a certain error, a retrieval range is first determined according to the position information in order to improve accuracy. In one specific implementation, this may be an area within a certain distance of the reported position, for example a planar area within 50 meters of that position.
Then, the building information within the retrieval range is retrieved. In this process, the locations of buildings may be determined from map information or from information provided by building providers, and the retrieval result includes the buildings within the range together with information such as the name and number of each building.
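A minimal sketch of this range query is given below, assuming the reported position and the 50-meter radius of the example above; the in-memory building list stands in for map or provider data.

```python
# A minimal sketch of filtering buildings by distance from the terminal position.
import math

BUILDINGS = [  # hypothetical records: (name, number, latitude, longitude)
    ("East Tower", "B-01", 31.01232, 121.34575),
    ("Exhibition Hall", "B-07", 31.01950, 121.35110),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def buildings_in_range(lat, lon, radius_m=50.0):
    return [b for b in BUILDINGS if haversine_m(lat, lon, b[2], b[3]) <= radius_m]

print(buildings_in_range(31.0123, 121.3456))   # only East Tower is within 50 m
```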
Next, imagery information corresponding to each building in the building information is acquired. In the embodiment of the present invention, the imagery information includes 3D model information and 2D picture information and may be stored in a database in advance. The 3D model information may be provided by the building's designer, or obtained by scanning and reconstructing the building's appearance with a camera or similar device, and generally describes the complete three-dimensional shape of the building. The 2D picture information may be collected by users and uploaded to the computing device, and may show the overall appearance of a building or only a partial view of it. In the embodiment of the invention, both the 3D model information and the 2D picture information need to be labeled manually to determine the region or position where the AR model is to be superimposed. Acquiring the imagery information corresponding to the building information therefore means determining the 3D model information or the 2D picture information of a building from the building's name or number. The imagery information corresponding to each building is acquired in order to compare it with the image information shot by the user and so determine which building the user has shot.
Then, the building contained in the image information is determined according to the imagery information. In the embodiment of the invention, image comparison is used to determine whether the subject in the image information is the same as the previously acquired imagery information of a building; if so, it can be determined which building the image information contains. Specifically, feature points of the main content of the image information and feature points of the imagery information are extracted separately and compared, and if the discrepancy between the two sets of feature points is within a certain range, the building corresponding to the main content of the image information can be taken to be the same building as the one represented by the imagery information.
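The following sketch illustrates one way such a feature-point comparison could be carried out, using ORB features and brute-force Hamming matching from OpenCV; the feature type and the thresholds are illustrative assumptions, not requirements of the embodiment.

```python
# A minimal sketch of deciding whether the captured frame shows a given building.
import cv2

def same_building(captured_bgr, building_image_bgr,
                  max_hamming=40, min_good_matches=25) -> bool:
    orb = cv2.ORB_create(nfeatures=1000)
    _, des_cap = orb.detectAndCompute(cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY), None)
    _, des_bld = orb.detectAndCompute(cv2.cvtColor(building_image_bgr, cv2.COLOR_BGR2GRAY), None)
    if des_cap is None or des_bld is None:
        return False
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_cap, des_bld)
    good = [m for m in matches if m.distance <= max_hamming]
    return len(good) >= min_good_matches

# Usage (hypothetical files): iterate over the candidate buildings returned by the
# range query and keep the one whose stored imagery passes the check, e.g.
#   captured = cv2.imread("frame.jpg"); candidate = cv2.imread("east_tower.jpg")
#   print(same_building(captured, candidate))
```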
Finally, the AR model corresponding to the building is retrieved. In the embodiment of the present invention, different AR models, such as pictures, videos, 3D models and animations, may be stored in the cloud processing device in advance. Different AR models correspond to different building imagery information, and the correspondence may be established by matching or by manual marking. Once the building has been determined, the corresponding AR model can be obtained by a direct lookup.
It should be noted that, in the embodiment of the present invention, when acquiring the imagery information corresponding to each building in the building information, the 3D model information corresponding to each building is acquired preferentially; if no 3D model information corresponding to the building information is available, the 2D picture information corresponding to each building is acquired instead. The 3D model is preferred because it describes the complete shape of the building, allows more accurate operation when the AR model is superimposed, is more robust to changes in local detail and illumination, and is helpful for the subsequent pose calculation. When no 3D model exists, 2D pictures are used; their advantage is that they are plentiful and come from a wide range of sources.
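A minimal sketch of this 3D-first fallback is shown below; the in-memory stores are stand-ins for the relational database discussed later, and the building numbers and file names are hypothetical.

```python
# A minimal sketch of preferring 3D model information and falling back to 2D pictures.
from typing import Any, Dict, List

MODEL_3D: Dict[str, Any] = {"B-01": "east_tower.obj"}               # building number -> 3D model
PICTURES_2D: Dict[str, List[str]] = {"B-07": ["hall_a.jpg", "hall_b.jpg"]}

def imagery_for_building(building_no: str):
    """Return ('3d', model) when a 3D model exists, otherwise ('2d', pictures)."""
    model = MODEL_3D.get(building_no)
    if model is not None:
        return "3d", model
    return "2d", PICTURES_2D.get(building_no, [])

print(imagery_for_building("B-01"))   # ('3d', 'east_tower.obj')
print(imagery_for_building("B-07"))   # ('2d', ['hall_a.jpg', 'hall_b.jpg'])
```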
104. Send the AR model to the terminal.
In the embodiment of the invention, after the AR model is sent to the terminal, the terminal starts the superimposition operation according to the received AR model. In order to superimpose the AR model onto the image information accurately, the relative pose between the building in the image information and the building in the imagery information is first calculated. Specifically, in one implementation, feature points of the building in the image information are extracted, then feature points of the building in the imagery information are extracted, the relative positional relationship of the feature points is calculated according to a feature-point matching principle, and the relative pose of the building in the image information is determined from that relationship.
Then, the AR model is superimposed onto the image information according to the relative pose to form superimposed image information, so that the pose of the AR model is the same as the position of the building in the image information. The superimposed image information is then displayed on the display unit.
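By way of illustration, the sketch below estimates the relative pose from matched 3D model points and their 2D locations in the captured frame using PnP, then marks the projected anchor of the AR content in the frame; the camera intrinsics, point correspondences and anchor are placeholder assumptions, and drawing a marker stands in for rendering the actual AR model.

```python
# A minimal sketch of pose estimation and overlay placement with OpenCV.
import cv2
import numpy as np

def overlay_ar(frame, pts3d_model, pts2d_frame, camera_matrix, ar_anchor_3d):
    """Estimate the building's relative pose and draw the AR anchor into the frame."""
    dist = np.zeros(5)  # assume an undistorted camera
    ok, rvec, tvec = cv2.solvePnP(pts3d_model, pts2d_frame, camera_matrix, dist)
    if not ok:
        return frame
    anchor_2d, _ = cv2.projectPoints(ar_anchor_3d, rvec, tvec, camera_matrix, dist)
    u, v = anchor_2d.reshape(2)
    # Stand-in for rendering the AR model: mark the anchor point on the frame.
    cv2.circle(frame, (int(u), int(v)), 12, (0, 255, 0), -1)
    return frame

# Usage with synthetic, self-consistent data (a real system would take the
# correspondences from feature matching against the building's 3D model):
frame = np.zeros((480, 640, 3), dtype=np.uint8)
pts3d = np.array([[0, 0, 0], [10, 0, 0], [10, 20, 0], [0, 20, 0],
                  [0, 0, 5], [10, 0, 5]], dtype=np.float32)
pts2d = np.array([[240.0, 80.0], [400.0, 80.0], [400.0, 400.0], [240.0, 400.0],
                  [247.3, 94.5], [392.7, 94.5]], dtype=np.float32)
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float32)
out = overlay_ar(frame, pts3d, pts2d, K, np.array([[5.0, 10.0, 0.0]], dtype=np.float32))
print(out.sum() > 0)   # True: the anchor marker was drawn near the frame center
```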
According to the information processing method provided by the embodiment of the invention, the image information shot by the terminal and the position information of the terminal are obtained, and the corresponding AR model is then retrieved by combining the two. Retrieving the AR model according to the position information narrows the search range and improves retrieval precision, while determining the corresponding AR model in combination with the image information shot by the terminal improves retrieval efficiency and the recognition rate of static objects, solving the prior-art problems of a large search amount and low accuracy in identifying static objects.
When the user holds the terminal or walks, its position and angle are not fixed, and correspondingly the image information shot by the terminal is not fixed either. To enhance immersion and keep the visual result smooth, the terminal may, on the basis of the above, also collect inertial measurement data, determine its moving pose from that data, and adjust the position of the AR model accordingly. In the embodiment of the present invention, the moving pose of the terminal may be calculated from data generated by an Inertial Measurement Unit (IMU) when the position or angle of the terminal changes: data such as acceleration changes and angular-velocity changes are collected from the IMU in the terminal, and the change in the terminal's position, that is, the moving pose, is calculated from those values. The position of the AR model in the superimposed image information is then updated according to the moving pose.
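The sketch below illustrates one simplified way the inertial data could be integrated into a moving pose between recognition updates; it is a planar dead-reckoning model with assumed sample values, not a full strapdown integrator, and the mapping from pose to pixel offset is left as a comment.

```python
# A minimal sketch of integrating IMU samples into an incremental planar pose.
import math
from dataclasses import dataclass

@dataclass
class PlanarPose:
    x: float = 0.0        # meters
    y: float = 0.0
    yaw: float = 0.0      # radians
    vx: float = 0.0       # m/s
    vy: float = 0.0

def integrate_imu(pose: PlanarPose, ax: float, ay: float, gyro_z: float, dt: float) -> PlanarPose:
    """Advance the pose by one IMU sample (device-frame accel ax, ay; yaw rate gyro_z)."""
    pose.yaw += gyro_z * dt
    # Rotate device-frame acceleration into the world frame before integrating.
    wx = ax * math.cos(pose.yaw) - ay * math.sin(pose.yaw)
    wy = ax * math.sin(pose.yaw) + ay * math.cos(pose.yaw)
    pose.vx += wx * dt
    pose.vy += wy * dt
    pose.x += pose.vx * dt
    pose.y += pose.vy * dt
    return pose

# The accumulated (x, y, yaw) would then be mapped to a pixel offset for the
# overlay, e.g. overlay_u += k * pose.x, overlay_v += k * pose.y for a scale k.
pose = PlanarPose()
for _ in range(100):                       # 1 s of samples at 100 Hz
    integrate_imu(pose, ax=0.2, ay=0.0, gyro_z=0.05, dt=0.01)
print(round(pose.x, 3), round(pose.y, 3), round(pose.yaw, 3))
```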
Therefore, the technical solution provided by the embodiment of the invention enables the AR model to move as the image information shot by the user moves, which improves timeliness, makes the visual result smoother and enhances the interaction with the user.
In the embodiment of the present invention, a large amount of data is stored in the computing device, and therefore, in order to improve retrieval speed and precision, a relational database may be established in advance. Specifically, fig. 2 is another flowchart of an embodiment of the information processing method according to the embodiment of the present invention, and as shown in fig. 2, before step 101 the information processing method according to the embodiment of the present invention may further include the following step:
100. Establish at least one relational database according to the building information, the imagery information and the AR model.
In the embodiment of the invention, imagery information can be determined for each building on the basis of map information and images uploaded by users through various channels; the imagery information is labeled to determine the position or region where the AR model is to be superimposed; and either a complete relational database covering the building information, the imagery information and the AR models is established, or a separate relational database is established by matching the corresponding AR model to each building.
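As an illustration only, the following SQLite sketch shows one possible relational layout linking buildings, their labeled imagery and AR models; the table and column names are assumptions made for the example.

```python
# A minimal sketch of the relational database of step 100, using SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE building (
    building_no TEXT PRIMARY KEY,
    name        TEXT NOT NULL,
    latitude    REAL NOT NULL,
    longitude   REAL NOT NULL
);
CREATE TABLE imagery (
    imagery_id     INTEGER PRIMARY KEY,
    building_no    TEXT NOT NULL REFERENCES building(building_no),
    kind           TEXT CHECK (kind IN ('3d_model', '2d_picture')),
    uri            TEXT NOT NULL,
    overlay_region TEXT            -- manually labeled region for the AR overlay
);
CREATE TABLE ar_model (
    ar_model_id  INTEGER PRIMARY KEY,
    building_no  TEXT NOT NULL REFERENCES building(building_no),
    uri          TEXT NOT NULL     -- picture, video, 3D model, or animation
);
""")
conn.execute("INSERT INTO building VALUES ('B-01', 'East Tower', 31.01232, 121.34575)")
conn.execute("INSERT INTO imagery VALUES (1, 'B-01', '3d_model', 'east_tower.obj', 'facade')")
conn.execute("INSERT INTO ar_model VALUES (1, 'B-01', 'promo_animation.mp4')")
row = conn.execute("""
    SELECT a.uri FROM ar_model a JOIN building b ON a.building_no = b.building_no
    WHERE b.building_no = 'B-01'
""").fetchone()
print(row[0])   # promo_animation.mp4
```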
Moreover, the content of the database can be changed continuously and adjusted according to actual requirements, with content added, deleted or modified.
The database is established to improve retrieval precision and efficiency, so that the AR model can be superimposed more accurately onto the image information shot by the user.
In order to implement the foregoing method, an embodiment of the present invention further provides an information processing apparatus. Fig. 3 is a schematic structural diagram of an embodiment of the information processing apparatus provided by the embodiment of the present invention, and as shown in fig. 3, the apparatus of this embodiment may include: a first obtaining unit 11, a second obtaining unit 12, a retrieval unit 13 and a sending unit 14.
A first obtaining unit 11, configured to obtain image information captured by a terminal;
a second obtaining unit 12, configured to obtain location information of a terminal;
a retrieval unit 13, configured to retrieve the AR model according to the position information and the image information;
a sending unit 14, configured to send the AR model to the terminal.
In a specific implementation process, the retrieval unit 13 is specifically configured to:
determining a retrieval range according to the position information;
retrieving building information within the retrieval range;
acquiring imagery information corresponding to each building in the building information;
determining a building contained in the image information according to the imagery information;
an AR model corresponding to the building is retrieved.
In a specific implementation process, the acquiring imagery information corresponding to each building in the building information includes:
acquiring 3D model information corresponding to each building in the building information; or,
if the 3D model information corresponding to the building information is not acquired, acquiring 2D picture information corresponding to each building in the building information.
The information processing apparatus provided in the embodiment of the present invention may be used to implement the technical solution of the method embodiment shown in fig. 1, and the implementation principle and the technical effect are similar, which are not described herein again.
Further, an embodiment of the present invention provides another information processing apparatus. Fig. 4 is another schematic structural diagram of the information processing apparatus provided by the embodiment of the present invention, and as shown in fig. 4, on the basis of the foregoing, the apparatus of this embodiment may further include: an establishing unit 15.
The establishing unit 15 is configured to establish at least one relational database according to the building information, the imagery information and the AR model.
The information processing apparatus provided in the embodiment of the present invention may be used to execute the technical solution of the method embodiment shown in fig. 2, and the implementation principle and the technical effect are similar, which are not described herein again.
Fig. 5 is a schematic structural diagram of an embodiment of a cloud processing device provided in an embodiment of the present invention, and as shown in fig. 5, the cloud processing device provided in the embodiment of the present invention may specifically include: a processor 21 and a memory 22.
The memory 22 is configured to store instructions that, when executed by the processor 21, cause the device to perform any of the methods shown in fig. 1 or fig. 2.
The cloud processing device provided in the embodiment of the present invention may be configured to execute the technical solution of the method embodiment shown in fig. 1 or fig. 2, and the implementation principle and the technical effect are similar, which are not described herein again.
The embodiment of the present invention further provides a computer program product, which can be directly loaded into an internal memory of a computer and contains software code; after the computer program is loaded and executed by the computer, the method shown in fig. 1 or fig. 2 can be implemented.
The computer program product provided by the embodiment of the present invention may be used to execute the technical solution of the method embodiment shown in fig. 1 or fig. 2, and the implementation principle and the technical effect are similar, which are not described herein again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one type of logical functional division, and other divisions may be realized in practice, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a Processor (Processor) to execute some steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (8)

1. An information processing method characterized by comprising:
acquiring image information shot by a terminal;
acquiring the position information of the terminal;
determining a retrieval range according to the position information, and retrieving all building information in the retrieval range;
acquiring imagery information corresponding to each building in the building information;
determining a building contained in the image information according to the imagery information;
retrieving an AR model corresponding to the building;
sending the AR model to the terminal;
superimposing the AR model onto the image information;
wherein the determining a building contained in the image information according to the imagery information comprises:
respectively extracting feature points of the building in the image information and feature points of the imagery information;
comparing the feature points of the building in the image information with the feature points of the imagery information;
if the difference between the feature points is within a preset error range, determining that the building in the image information is the same as the building represented by the imagery information;
wherein the superimposing the AR model onto the image information comprises:
calculating the relative pose between the building in the image information and the building in the imagery information;
superimposing the AR model onto the image information according to the relative pose to form superimposed image information, so that the pose of the AR model is the same as the position of the building in the image information;
wherein the calculating the relative pose between the building in the image information and the building in the imagery information comprises:
extracting the feature points of the building in the image information, and then extracting the feature points of the building in the imagery information;
and respectively calculating the relative positional relationship of each feature point according to a feature point matching principle, and determining the relative pose of the building in the image information according to the relative positional relationship.
2. The method of claim 1, wherein the acquiring imagery information corresponding to each building in the building information comprises:
acquiring 3D model information corresponding to each building in the building information; or,
if the 3D model information corresponding to the building information is not acquired, acquiring 2D picture information corresponding to each building in the building information.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
establishing at least one relational database according to the building information, the imagery information and the AR model.
4. An information processing apparatus characterized by comprising:
the first acquisition unit is used for acquiring image information shot by the terminal;
a second obtaining unit, configured to obtain location information of the terminal;
a retrieval unit configured to determine a retrieval range according to the position information, retrieve building information within the retrieval range, acquire imagery information corresponding to each building in the building information, determine a building contained in the image information according to the imagery information, and retrieve an AR model corresponding to the building;
a sending unit, configured to send the AR model to the terminal;
the information processing apparatus is further configured to superimpose the AR model onto the image information, wherein,
the information processing apparatus is specifically configured to calculate the relative pose between the building in the image information and the building in the imagery information, and to superimpose the AR model onto the image information according to the relative pose to form superimposed image information, so that the pose of the AR model is the same as the position of the building in the image information;
the retrieval unit is further configured to extract feature points of the building in the image information and feature points of the imagery information, compare the feature points of the building in the image information with the feature points of the imagery information, and determine that the building in the image information is the same as the building represented by the imagery information if the difference between the feature points is within a preset error range;
wherein the calculating the relative pose between the building in the image information and the building in the imagery information includes:
extracting the feature points of the building in the image information, and then extracting the feature points of the building in the imagery information;
and respectively calculating the relative positional relationship of each feature point according to a feature point matching principle, and determining the relative pose of the building in the image information according to the relative positional relationship.
5. The apparatus of claim 4, wherein the acquiring imagery information corresponding to each building in the building information comprises:
acquiring 3D model information corresponding to each building in the building information; or,
if the 3D model information corresponding to the building information is not acquired, acquiring 2D picture information corresponding to each building in the building information.
6. The apparatus of claim 4 or 5, further comprising:
an establishing unit, configured to establish at least one relational database according to the building information, the imagery information and the AR model.
7. A cloud processing device, the device comprising a processor and a memory; the memory is configured to store instructions that, when executed by the processor, cause the apparatus to perform the method of any of claims 1-3.
8. A computer storage medium which is directly loadable into the internal memory of a computer and contains software code, wherein the software code, when loaded into and executed by the computer, performs the method according to any one of claims 1-3.
CN201780002896.9A 2017-12-29 2017-12-29 Information processing method and device, cloud processing equipment and computer program product Active CN108235764B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/119708 WO2019127320A1 (en) 2017-12-29 2017-12-29 Information processing method and apparatus, cloud processing device, and computer program product

Publications (2)

Publication Number Publication Date
CN108235764A CN108235764A (en) 2018-06-29
CN108235764B true CN108235764B (en) 2022-09-16

Family

ID=62645531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780002896.9A Active CN108235764B (en) 2017-12-29 2017-12-29 Information processing method and device, cloud processing equipment and computer program product

Country Status (2)

Country Link
CN (1) CN108235764B (en)
WO (1) WO2019127320A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109545003B (en) * 2018-12-24 2022-05-03 北京卡路里信息技术有限公司 Display method, display device, terminal equipment and storage medium
CN109886191A (en) * 2019-02-20 2019-06-14 上海昊沧系统控制技术有限责任公司 A kind of identification property management reason method and system based on AR

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9639959B2 (en) * 2012-01-26 2017-05-02 Qualcomm Incorporated Mobile device configured to compute 3D models based on motion sensor data
CN103107938A (en) * 2013-01-25 2013-05-15 腾讯科技(深圳)有限公司 Information interactive method, server and terminal
CN103489002B (en) * 2013-09-27 2017-03-29 广州中国科学院软件应用技术研究所 A kind of augmented reality method and system
US9147113B2 (en) * 2013-10-07 2015-09-29 Hong Kong Applied Science and Technology Research Institute Company Limited Deformable surface tracking in augmented reality applications
US9652896B1 (en) * 2015-10-30 2017-05-16 Snap Inc. Image based tracking in augmented reality systems
CN106101574B (en) * 2016-06-28 2019-04-23 Oppo广东移动通信有限公司 A kind of control method, device and the mobile terminal of image enhancement reality
CN106354869A (en) * 2016-09-13 2017-01-25 四川研宝科技有限公司 Real-scene image processing method and server based on location information and time periods
CN106446098A (en) * 2016-09-13 2017-02-22 四川研宝科技有限公司 Live action image processing method and server based on location information
CN106373198A (en) * 2016-09-18 2017-02-01 福州大学 Method for realizing augmented reality
CN106529452B (en) * 2016-11-04 2019-04-23 重庆市勘测院 Mobile intelligent terminal building method for quickly identifying based on building threedimensional model

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Video see-through augmented reality system; Chen Jinhua (陈金华); Construction of Smart Learning Environments (《智慧学习环境构建》); 2013-09-30; Section 5.3 *

Also Published As

Publication number Publication date
WO2019127320A1 (en) 2019-07-04
CN108235764A (en) 2018-06-29

Similar Documents

Publication Publication Date Title
US11393173B2 (en) Mobile augmented reality system
US10380410B2 (en) Apparatus and method for image-based positioning, orientation and situational awareness
EP2915140B1 (en) Fast initialization for monocular visual slam
US10225506B2 (en) Information processing apparatus and information processing method
US10462406B2 (en) Information processing apparatus and information processing method
US8872851B2 (en) Augmenting image data based on related 3D point cloud data
US9934222B2 (en) Providing a thumbnail image that follows a main image
US20120113145A1 (en) Augmented reality surveillance and rescue system
CN106233371A (en) Select the panoramic picture for the Annual distribution shown
WO2013055980A1 (en) Method, system, and computer program product for obtaining images to enhance imagery coverage
CN105023266A (en) Method and device for implementing augmented reality (AR) and terminal device
TW201823983A (en) Method and system for creating virtual message onto a moving object and searching the same
CN107084740B (en) Navigation method and device
US20200387711A1 (en) Indoor augmented reality information display method
CN112330819A (en) Interaction method and device based on virtual article and storage medium
CN115731370A (en) Large-scene element universe space superposition method and device
CN114185073A (en) Pose display method, device and system
CN108235764B (en) Information processing method and device, cloud processing equipment and computer program product
TW201823929A (en) Method and system for remote management of virtual message for a moving object
EP3007136B1 (en) Apparatus and method for generating an augmented reality representation of an acquired image
US10108882B1 (en) Method to post and access information onto a map through pictures
WO2018094289A1 (en) Remote placement of digital content to facilitate augmented reality system
US10878278B1 (en) Geo-localization based on remotely sensed visual features
CN111882675A (en) Model presentation method and device, electronic equipment and computer storage medium
CN109074356A (en) System and method for being optionally incorporated into image in low bandwidth digital map database

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20210224

Address after: 201111 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Applicant after: Dalu Robot Co.,Ltd.

Address before: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Applicant before: CLOUDMINDS (SHENZHEN) ROBOTICS SYSTEMS Co.,Ltd.

CB02 Change of applicant information
CB02 Change of applicant information

Address after: 201111 Building 8, No. 207, Zhongqing Road, Minhang District, Shanghai

Applicant after: Dayu robot Co.,Ltd.

Address before: 201111 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Applicant before: Dalu Robot Co.,Ltd.

GR01 Patent grant
GR01 Patent grant