
CN107608541B - Three-dimensional attitude positioning method and device and electronic equipment - Google Patents


Info

Publication number
CN107608541B
CN107608541B (application CN201710961392.1A)
Authority
CN
China
Prior art keywords
image
coordinates
preset
flying mouse
ordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710961392.1A
Other languages
Chinese (zh)
Other versions
CN107608541A (en)
Inventor
任仲超
陆小松
张涛
蒲天发
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Shiruidi Photoelectric Co ltd
Ningbo Thredim Optoelectronics Co ltd
Original Assignee
Jiangsu Thredim Photoelectric Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Thredim Photoelectric Co ltd filed Critical Jiangsu Thredim Photoelectric Co ltd
Priority to CN201710961392.1A priority Critical patent/CN107608541B/en
Publication of CN107608541A publication Critical patent/CN107608541A/en
Application granted granted Critical
Publication of CN107608541B publication Critical patent/CN107608541B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a three-dimensional attitude positioning method and device and electronic equipment, and relates to the technical field of computers. The three-dimensional attitude positioning method comprises the following steps: acquiring a plurality of first position data, measured by a sensor of a flying mouse internally provided with an infrared diode, corresponding to the flying mouse at a plurality of preset positions respectively, wherein the first position data comprise attitude angle data and a first plane coordinate; obtaining a plurality of second position coordinates corresponding to the flying mouse at the plurality of preset positions respectively, according to position images of the flying mouse at the plurality of preset positions acquired by a binocular CMOS camera provided with an infrared filter; obtaining, based on a first preset algorithm, a plurality of first position coordinates corresponding to the plurality of first plane coordinates and the plurality of second position coordinates; and generating second position data corresponding to the flying mouse at the plurality of preset positions respectively, based on the plurality of first position coordinates and the plurality of attitude angle data. The three-dimensional attitude positioning method, device and electronic equipment can acquire the attitude data of the flying mouse even under dark conditions.

Description

Three-dimensional attitude positioning method and device and electronic equipment
Technical Field
The invention relates to the technical field of computers, in particular to a three-dimensional attitude positioning method and device and electronic equipment.
Background
At present, under dark conditions, the positioning sensor in a wireless flying mouse (air mouse) can obtain the abscissa, the ordinate and the attitude angle data in three directions of the flying mouse in a designated plane, but cannot obtain the vertical coordinate, so the three-dimensional attitude positioning data of the flying mouse cannot be obtained in the dark.
Disclosure of Invention
In view of this, embodiments of the present invention provide a method and an apparatus for three-dimensional pose positioning, and an electronic device, so as to solve the above problems.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
a method of three-dimensional pose localization, the method comprising: acquiring a plurality of first position data which are respectively corresponding to a plurality of preset positions of a flying mouse and are measured by a sensor of the flying mouse internally provided with an infrared diode, wherein the first position data comprise attitude angle data and a first plane coordinate; obtaining a plurality of first space coordinates respectively corresponding to the flying mouse at the plurality of preset positions according to position images of the flying mouse at the plurality of preset positions, which are acquired by a binocular CMOS camera provided with an infrared filter; obtaining a plurality of first plane coordinates and a plurality of first position coordinates corresponding to the first space coordinates based on a first preset algorithm; and generating second position data corresponding to the flying mouse at the preset positions respectively based on the first position coordinates and the attitude angle data.
A three-dimensional pose positioning apparatus, the apparatus comprising: the device comprises a first data acquisition module, a second data acquisition module, a processing module and a data generation module, wherein the first data acquisition module is used for acquiring a plurality of first position data which are respectively corresponding to a plurality of preset positions of a flying mouse measured by a sensor of the flying mouse with an infrared diode arranged inside, and the first position data comprise attitude angle data and first plane coordinates; the second data acquisition module is used for acquiring a plurality of first space coordinates corresponding to the flying mouse at the plurality of preset positions respectively according to position images of the flying mouse at the plurality of preset positions, which are acquired by a binocular CMOS camera provided with an infrared filter; the processing module is used for obtaining a plurality of first plane coordinates and a plurality of first position coordinates corresponding to the first space coordinates based on a first preset algorithm; the data generation module is used for generating second position data corresponding to the flying mouse at the plurality of preset positions respectively based on the plurality of first position coordinates and the plurality of attitude angle data.
An electronic device comprising a memory and a processor, the memory coupled to the processor, the memory storing instructions that, when executed by the processor, cause the processor to: acquiring a plurality of first position data which are respectively corresponding to a plurality of preset positions of a flying mouse and are measured by a sensor of the flying mouse internally provided with an infrared diode, wherein the first position data comprise attitude angle data and a first plane coordinate; obtaining a plurality of first space coordinates respectively corresponding to the flying mouse at the plurality of preset positions according to position images of the flying mouse at the plurality of preset positions, which are acquired by a binocular CMOS camera provided with an infrared filter; obtaining a plurality of first plane coordinates and a plurality of first position coordinates corresponding to the first space coordinates based on a first preset algorithm; and generating second position data corresponding to the flying mouse at the preset positions respectively based on the first position coordinates and the attitude angle data.
According to the three-dimensional attitude positioning method, the three-dimensional attitude positioning device and the electronic equipment, a plurality of first position data which respectively correspond to a flying mouse at a plurality of preset positions and are measured by a sensor of the flying mouse internally provided with an infrared diode are obtained, wherein the first position data comprise attitude angle data and first plane coordinates; then, according to position images of the flying mouse at a plurality of preset positions, which are acquired by a binocular CMOS camera provided with an infrared filter, a plurality of first space coordinates, which correspond to the flying mouse at the plurality of preset positions respectively, are obtained; obtaining a plurality of first plane coordinates and a plurality of first position coordinates corresponding to the plurality of first space coordinates based on a first preset algorithm; and finally, generating second position data corresponding to the volleyball at a plurality of preset positions respectively based on the plurality of first position coordinates and the plurality of attitude angle data. Therefore, under the dark condition, the three-dimensional attitude data corresponding to the flying mouse can be obtained, and the problem that the three-dimensional attitude positioning data of the flying mouse cannot be obtained under the dark condition is solved.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
FIG. 1 is a block diagram of an electronic device provided by an embodiment of the invention;
FIG. 2 is a flow chart of a three-dimensional pose location method provided by an embodiment of the invention;
FIG. 3 is a flowchart illustrating step S120 of a three-dimensional pose location method according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a preset model provided by an embodiment of the present invention;
FIG. 5 is a flowchart illustrating step S130 of a three-dimensional pose location method according to an embodiment of the present invention;
FIG. 6 is a block diagram of a three-dimensional pose positioning apparatus provided by an embodiment of the present invention;
FIG. 7 is a block diagram of a processing module of the three-dimensional pose positioning apparatus provided by the embodiments of the present invention;
fig. 8 is a block diagram of a second data acquisition module of the three-dimensional attitude determination apparatus according to the embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present invention, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Fig. 1 shows a block diagram of an electronic device applicable to an embodiment of the present invention. As shown in FIG. 1, electronic device 100 includes a memory 102, a memory controller 104, one or more processors 106 (only one shown), a peripherals interface 108, a radio frequency module 110, an audio module 112, a display unit 114, and the like. These components communicate with each other via one or more communication buses/signal lines 116.
The memory 102 may be used to store software programs and modules, such as program instructions/modules corresponding to the three-dimensional pose positioning method and apparatus in the embodiment of the present invention, and the processor 106 executes various functional applications and data processing by running the software programs and modules stored in the memory 102, such as the three-dimensional pose positioning apparatus provided in the embodiment of the present invention.
The memory 102 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. Access to the memory 102 by the processor 106, and possibly other components, may be under the control of the memory controller 104.
The peripheral interface 108 couples various input/output devices to the processor 106 and to the memory 102. In some embodiments, the peripheral interface 108, the processor 106 and the memory controller 104 may be implemented in a single chip. In other embodiments, they may each be implemented by a separate chip.
The rf module 110 is used for receiving and transmitting electromagnetic waves, and implementing interconversion between the electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices.
Audio module 112 provides an audio interface to a user that may include one or more microphones, one or more speakers, and audio circuitry.
The display unit 114 provides a display interface between the electronic device 100 and a user. In particular, display unit 114 displays video output to the user, the content of which may include text, graphics, video, and any combination thereof.
It will be appreciated that the configuration shown in FIG. 1 is merely illustrative and that electronic device 100 may include more or fewer components than shown in FIG. 1 or have a different configuration than shown in FIG. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
First embodiment
Fig. 2 shows a flowchart of a three-dimensional pose positioning method according to an embodiment of the present invention. Referring to fig. 2, the method includes:
step S110: the method comprises the steps of obtaining a plurality of first position data which are respectively corresponding to a plurality of preset positions of a flying mouse and are measured by a sensor of the flying mouse with an infrared diode arranged inside, wherein the first position data comprise attitude angle data and first plane coordinates.
During the movement of the flying mouse, the gyroscope sensor in the flying mouse can acquire the attitude angle data of the flying mouse in the transverse, longitudinal and vertical directions, as well as its abscissa and ordinate data.
In the embodiment of the invention, because the three-dimensional attitude data of the flying mouse cannot otherwise be obtained under dark conditions, an infrared diode can be arranged inside the flying mouse so that images of the flying mouse can be captured for the subsequent three-dimensional attitude data. In addition, the shape of the flying mouse may be spherical. The flying mouse can move in the region facing the screen, and a plurality of preset positions can be set. The flying mouse can also be in communication connection with the electronic equipment that implements the three-dimensional attitude positioning method. The preset positions are positions of the flying mouse during its movement; specifically, the positions of the flying mouse corresponding to preset time points can be used as the preset positions. For example, the movement of the flying mouse over a certain period is divided into a plurality of time points such as time point 1, time point 2, time point 3, and so on, and the positions of the flying mouse at these time points are its preset positions.
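The time-point sampling just described can be sketched as follows; `track`, `t0`, `dt` and `n` are illustrative names, not terms from the patent:

```python
def preset_positions(track, t0, dt, n):
    """Sample the flying mouse's trajectory at n preset time points
    t0, t0 + dt, ..., t0 + (n - 1) * dt; each sampled position is
    treated as one preset position."""
    return [track(t0 + k * dt) for k in range(n)]

# Toy trajectory: the mouse moves along the line y = 2 * t.
positions = preset_positions(lambda t: (t, 2 * t), 0.0, 0.5, 3)
```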
Thus, a plurality of first position data corresponding to the flying mouse measured by the flying mouse sensor at a plurality of preset positions can be acquired. And the first position data comprises attitude angles in the transverse direction, the longitudinal direction and the vertical direction and first plane coordinates formed by the transverse coordinate and the longitudinal coordinate.
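A minimal sketch of the shape of one first-position-data record, assuming the structure described above (three attitude angles plus a plane coordinate); the field names are hypothetical, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class FirstPositionData:
    """One first-position-data record: attitude angles in three
    directions plus the first plane coordinate. Field names are
    illustrative only."""
    roll: float   # attitude angle, transverse direction (degrees)
    pitch: float  # attitude angle, longitudinal direction (degrees)
    yaw: float    # attitude angle, vertical direction (degrees)
    x: float      # abscissa of the first plane coordinate
    y: float      # ordinate of the first plane coordinate

sample = FirstPositionData(roll=1.5, pitch=-2.0, yaw=30.0, x=120.0, y=85.0)
first_plane_coordinate = (sample.x, sample.y)
```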
Step S120: obtaining a plurality of first space coordinates corresponding to the flying mouse at the plurality of preset positions respectively, according to position images of the flying mouse at the preset positions acquired by a binocular CMOS camera provided with an infrared filter.
In the embodiment of the invention, the binocular CMOS camera can be arranged on the screen facing the flying mouse, and because images of the flying mouse need to be captured, an infrared filter can be mounted on the binocular CMOS camera. Preferably, the binocular CMOS camera may be disposed at the top of the screen.
Therefore, in the process of the movement of the flying mouse, the binocular CMOS camera can be controlled to acquire the image of the position of the flying mouse. Specifically, the binocular CMOS camera can be controlled to acquire the position image of the flying mouse at the preset position.
Then, a plurality of first space coordinates respectively corresponding to the flying mouse at a plurality of preset positions can be obtained according to the position images of the flying mouse at the plurality of preset positions, which are acquired by the binocular CMOS camera.
Specifically, referring to fig. 3, obtaining a plurality of first spatial coordinates corresponding to the flying mouse at a plurality of preset positions according to position images of the flying mouse at the plurality of preset positions acquired by the binocular CMOS camera may include:
step S121: and acquiring a first image and a second image of the flying mouse at a preset position, wherein the first image and the second image are respectively acquired by a first CMOS camera and a second CMOS camera of the binocular CMOS camera.
Specifically, step S121 may include: controlling the first CMOS camera and the second CMOS camera to acquire a third image and a fourth image of the flying mouse; acquiring a first frame selection image and a second frame selection image which respectively correspond to the third image and the fourth image, wherein the first frame selection image and the second frame selection image are images corresponding to frame selection areas corresponding to the flying mouse; and when the images of the flying mouse at the preset position, which are respectively acquired by the first CMOS camera and the second CMOS camera, are respectively matched with the first frame selection image and the second frame selection image, the image acquired by the first CMOS camera is used as a first image, and the image acquired by the second CMOS camera is used as a second image.
First, the first CMOS camera and the second CMOS camera can be controlled to respectively acquire a third image and a fourth image of the flying mouse, in which the infrared diode is disposed. The third image and the fourth image can then be converted from the RGB color space to the HSI color space. Because the captured flying mouse contains an infrared diode and the binocular CMOS camera is fitted with an infrared filter, the color of the flying mouse in the captured images is mainly red. To extract the features of the flying mouse, the H component, whose value ranges from 0 to 180, can be extracted from the HSI images. It should be noted that this range of the H component covers the color range of the flying mouse. A maximum square or rectangular region containing the flying mouse can then be selected in each image through human-computer interaction, as the first frame-selection image corresponding to the third image and the second frame-selection image corresponding to the fourth image. Color histograms of the first frame-selection image and the second frame-selection image are computed with built-in OpenCV functions, and the back-projection views corresponding to the first and second frame-selection images are computed from these histograms. It can be understood that the color of the flying mouse occupies the largest proportion in the back-projection view, and owing to the spherical shape of the flying mouse it always appears as a white sphere in the back-projection view. As the flying mouse moves, its position must be tracked to determine where it is.
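The histogram back-projection idea can be illustrated in miniature. In practice OpenCV's `calcHist` and `calcBackProject` would be used; the pure-Python sketch below (all names illustrative) only shows the principle: pixels whose hue dominates the framed region come out brightest.

```python
def hue_histogram(hues, bins=30, h_max=180):
    """Histogram of H-component values (0..180), normalised so the
    most frequent bin has weight 1.0."""
    hist = [0] * bins
    for h in hues:
        hist[min(h * bins // h_max, bins - 1)] += 1
    peak = max(hist) or 1
    return [v / peak for v in hist]

def back_project(image_h, hist, bins=30, h_max=180):
    """Replace each pixel's hue with the histogram weight of its bin."""
    return [[hist[min(h * bins // h_max, bins - 1)] for h in row]
            for row in image_h]

# Frame-selection region is pure red (hue 0); a hue-90 pixel is suppressed.
hist = hue_histogram([0, 0, 0])
bp = back_project([[0, 90], [0, 0]], hist)
```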
Specifically, the first CMOS camera can be controlled to acquire an image of the flying mouse at a preset position, and the second CMOS camera can likewise acquire an image of the flying mouse at that preset position. When it is determined that the image acquired by the first CMOS camera matches the first frame-selection image and the image acquired by the second CMOS camera matches the second frame-selection image, the image acquired by the first CMOS camera is taken as the first image and the image acquired by the second CMOS camera as the second image. It can be understood that the position of the flying mouse is determined with a tracking frame, which corresponds to the frames of the first and second frame-selection images. The centroid of the color within the tracking frame is calculated by the meanshift algorithm; when the centroid deviates from the center of the frame, the centroid is taken as the new center and the image is collected again. When the center of the collected frame coincides with the centroid, the position of the tracking frame is the position of the flying mouse, and at that moment the tracking frame's image has the highest matching degree, i.e. the highest color matching degree, with the first and second frame-selection images. In this way, the first image acquired by the first CMOS camera and the second image acquired by the second CMOS camera at the preset position of the flying mouse are obtained.
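The re-centring loop of meanshift tracking can be sketched on a back-projection given as a plain 2-D list of weights (OpenCV's `meanShift` performs the same iteration on a real back-projection image; the names and the tiny test image below are illustrative):

```python
def window_centroid(backproj, win):
    """Weighted centroid of back-projection values inside win = (x, y, w, h)."""
    x0, y0, w, h = win
    m = mx = my = 0.0
    for yy in range(y0, y0 + h):
        for xx in range(x0, x0 + w):
            wgt = backproj[yy][xx]
            m += wgt
            mx += wgt * xx
            my += wgt * yy
    return None if m == 0 else (mx / m, my / m)

def mean_shift(backproj, win, iters=20):
    """Re-centre the tracking window on the colour centroid until the
    window stops moving."""
    height, width = len(backproj), len(backproj[0])
    x0, y0, w, h = win
    for _ in range(iters):
        c = window_centroid(backproj, (x0, y0, w, h))
        if c is None:
            break
        nx = max(0, min(int(round(c[0] - w / 2)), width - w))
        ny = max(0, min(int(round(c[1] - h / 2)), height - h))
        if (nx, ny) == (x0, y0):
            break
        x0, y0 = nx, ny
    return (x0, y0, w, h)

# A white 2x2 "sphere" at columns 5-6, rows 4-5; start the window off-target.
backproj = [[0.0] * 8 for _ in range(8)]
for yy, xx in [(4, 5), (4, 6), (5, 5), (5, 6)]:
    backproj[yy][xx] = 1.0
win = mean_shift(backproj, (2, 2, 4, 4))
```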
Step S122: acquiring a first back-projection view corresponding to the first image and a second back-projection view corresponding to the second image, wherein the first back-projection view and the second back-projection view include the areas corresponding to the flying mouse.
Then, following the method for obtaining a back-projection view described under step S121 above, a first back-projection view of the first image and a second back-projection view corresponding to the second image are obtained. The first and second back-projection views include the regions corresponding to the flying mouse.
Step S123: acquiring, according to a second preset algorithm, a second transformation matrix relating the coordinate points in the areas corresponding to the flying mouse in the first back-projection view and the second back-projection view, and a third coordinate point satisfying the second transformation matrix.
The coordinates of the white circular area are then acquired from the first back-projection view and the second back-projection view corresponding to the same preset position, and the coordinates of the white circular area in the two views are matched using the RANSAC matching algorithm to obtain the best matching points. The idea of the matching algorithm is as follows: randomly choose a RANSAC sample, namely 4 matching point pairs, from the sample set; calculate a transformation matrix M from the 4 matching point pairs; calculate, from the sample set, the transformation matrix M and an error metric function, the consensus set satisfying the current transformation matrix, and return the number of elements in the consensus set; judge, from the number of elements in the current consensus set, whether it is the optimal (largest) consensus set, and if so, update the current optimal consensus set; update the current error probability p, and if p is greater than the allowed minimum error probability, repeat the above steps and continue iterating until p is less than the minimum error probability. In this way, an optimal transformation matrix can finally be found as the second transformation matrix, and the optimal points satisfying it are obtained as the third coordinate point. It can be understood that the third coordinate point may be a coordinate point in the white area of the first back-projection view corresponding to the first image captured by the first CMOS camera.
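The sample/score/keep-best structure of that loop can be shown with a deliberately simplified model: a 2-D translation estimated from a single sampled pair, rather than the 4-pair transformation matrix M the patent fits. This is a sketch under that assumption; all names are illustrative.

```python
import random

def ransac_translation(src, dst, thresh=1.0, iters=200, seed=0):
    """Miniature RANSAC: repeatedly sample one correspondence, fit a
    translation (tx, ty), count inliers within thresh, and keep the
    model with the largest consensus set."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        i = rng.randrange(len(src))
        tx = dst[i][0] - src[i][0]
        ty = dst[i][1] - src[i][1]
        inliers = [j for j, (s, d) in enumerate(zip(src, dst))
                   if abs(s[0] + tx - d[0]) <= thresh
                   and abs(s[1] + ty - d[1]) <= thresh]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (tx, ty), inliers
    return best_model, best_inliers

# Four points translated by (3, 2); the last pair is a gross mismatch.
src = [(0, 0), (1, 0), (2, 1), (5, 5), (9, 9)]
dst = [(3, 2), (4, 2), (5, 3), (8, 7), (0, 0)]
model, inliers = ransac_translation(src, dst)
```

The outlier pair cannot recruit a consensus set larger than itself, so the translation fitted from any inlier wins.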
Step S124: acquiring a second abscissa, a second ordinate and a first vertical coordinate of the flying mouse at the preset position based on the third coordinate point, the second transformation matrix and a preset model.
Then, based on the second transformation matrix, the point corresponding to the flying mouse, matched to the third coordinate point, is obtained in the back-projection view of the other camera. The coordinates of the flying mouse can then be calculated based on a preset model, whose schematic diagram is shown in fig. 4; the preset model describes the positional relationship of the images shot by the binocular CMOS camera. The left camera may be the first CMOS camera and the right camera the second CMOS camera; the left view may be the first image and the right view the second image. B is the distance between the optical center of the first CMOS camera and the optical center of the second CMOS camera; P is the flying mouse; P_left is the point of the flying mouse in the first image; P_right is the point of the flying mouse in the second image; X_left is the abscissa of the flying mouse in the first image; X_right is the abscissa of the flying mouse in the second image; Y is the ordinate of the flying mouse in the first image, which is consistent with its ordinate in the second image; the origin of the first image is O_L; d is the disparity, i.e. d = X_left − X_right; and f is the focal length. According to the triangular geometric relationship:
X_left = f · x / z,    X_right = f · (x − B) / z,    Y = f · y / z
where (x, y, z) is the space coordinate of the point P in the left-camera coordinate system.
the following can be obtained:
x = B · X_left / d,    y = B · Y / d,    z = B · f / d,    where d = X_left − X_right.
therefore, by substituting the matched third coordinate point, the second abscissa, the second ordinate and the first vertical coordinate of the flying mouse, i.e. of the point P, can be calculated, yielding the first space coordinate composed of the second abscissa, the second ordinate and the first vertical coordinate.
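The relations above can be checked numerically with a small helper, a sketch under the convention d = X_left − X_right with both image abscissas measured from the left-camera origin (function and variable names are illustrative):

```python
def triangulate(x_left, x_right, y_img, baseline, focal):
    """Recover the space coordinate (x, y, z) of point P from matched
    image coordinates, using x = B*X_left/d, y = B*Y/d, z = B*f/d."""
    d = x_left - x_right  # disparity
    if d == 0:
        raise ValueError("zero disparity: point at infinity")
    return (baseline * x_left / d,
            baseline * y_img / d,
            baseline * focal / d)

# Round trip: project a known point with the similar-triangle relations,
# then recover it from its image coordinates.
B, f = 0.5, 2.0
x, y, z = 2.0, 1.0, 4.0
x_l, x_r, y_i = f * x / z, f * (x - B) / z, f * y / z
recovered = triangulate(x_l, x_r, y_i, B, f)
```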
Step S125: repeating the steps from acquiring the first image and the second image of the flying mouse at one preset position, respectively acquired by the first CMOS camera and the second CMOS camera of the binocular CMOS camera, through acquiring the second abscissa, the second ordinate and the first vertical coordinate of the flying mouse at that preset position based on the third coordinate point, the second transformation matrix and the preset model, so as to obtain the first space coordinates, each composed of a second abscissa, a second ordinate and a first vertical coordinate, of the flying mouse at the plurality of preset positions.
By repeating steps S121 to S124 at each preset position, the first space coordinate of the flying mouse at that preset position can be obtained, so as to obtain a plurality of first space coordinates corresponding to the flying mouse at the plurality of preset positions.
Step S130: obtaining, based on a first preset algorithm, a plurality of first position coordinates corresponding to the plurality of first plane coordinates and the plurality of first space coordinates.
After the first plane coordinates and the first space coordinates corresponding to the flying mouse at the plurality of preset positions are obtained, the abscissas and ordinates of the first plane coordinates and the first space coordinates are corrected.
Specifically, the first plane coordinate includes a first abscissa and a first ordinate, and the first space coordinate includes a second abscissa, a second ordinate and a first vertical coordinate. Referring to fig. 5, step S130 may include:
step S131: and obtaining a correction function of the plurality of first abscissas and the plurality of first ordinates, and the plurality of second abscissas and the plurality of second ordinates based on a preset coordinate correction algorithm.
Specifically, step S131 may include: obtaining, based on a preset feature matching algorithm, a first transformation matrix that maps a first set formed by the plurality of first abscissas and first ordinates to a second set formed by the plurality of second abscissas and second ordinates; acquiring a set of differences between a third set, obtained by mapping the first set through the first transformation matrix, and the second set; taking partial derivatives of the squared differences by the least square method, and obtaining the first coordinate point, formed by the first abscissa and first ordinate corresponding to the difference point with the minimum deviation, together with the corresponding second coordinate point, formed by a second abscissa and a second ordinate; and generating a first fitted straight-line function as the correction function based on the first coordinate point and the second coordinate point.
In the embodiment of the present invention, the preset feature matching algorithm may be a RANSAC algorithm. Of course, the specific preset feature matching algorithm is not limited in the embodiment of the present invention.
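Where the preset feature matching algorithm is RANSAC, the estimation of the first transformation matrix between the two coordinate sets might be sketched as below. This is a minimal numpy sketch under assumed parameters (an affine model, 200 iterations, a 1-unit inlier threshold); the function name `ransac_affine` and all parameter values are illustrative and not taken from the patent.

```python
import numpy as np

def ransac_affine(src, dst, n_iter=200, thresh=1.0, seed=0):
    """Estimate a 2x3 affine transform mapping src -> dst with RANSAC.

    src, dst: (N, 2) arrays of matched 2-D coordinate points.
    Returns (best_matrix, inlier_mask). A sketch only; a production
    system would refine the model and the inlier criterion.
    """
    rng = np.random.default_rng(seed)
    n = len(src)
    src_h = np.hstack([src, np.ones((n, 1))])  # homogeneous (N, 3)
    best_mask = np.zeros(n, dtype=bool)
    for _ in range(n_iter):
        idx = rng.choice(n, size=3, replace=False)
        # Solve src_h[idx] @ m = dst[idx] for the (3, 2) matrix m
        m, *_ = np.linalg.lstsq(src_h[idx], dst[idx], rcond=None)
        resid = np.linalg.norm(src_h @ m - dst, axis=1)
        mask = resid < thresh
        if mask.sum() > best_mask.sum():
            best_mask = mask
    # Refit on all inliers for the final estimate
    m, *_ = np.linalg.lstsq(src_h[best_mask], dst[best_mask], rcond=None)
    return m.T, best_mask
```

A usage note: the returned 2x3 matrix plays the role of the first transformation matrix, and the inlier mask identifies the coordinate pairs that agree with it.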
Step S132: and correcting the plurality of first abscissas, the plurality of first ordinates, the plurality of second abscissas and the plurality of second ordinates based on the correction function to obtain a third abscissa and a third ordinate of the flying mouse at a plurality of preset positions.
After the correction function is obtained, the plurality of first abscissas, first ordinates, second abscissas, and second ordinates are corrected according to the correction function, namely the first fitted straight-line function. Specifically, a second abscissa and a second ordinate belonging to the same first spatial coordinate may constitute a second planar coordinate. The points close to the straight line corresponding to the first fitted straight-line function are then retained, so that the remaining, corrected plane coordinates among the first plane coordinates and the second plane coordinates are obtained; the abscissa and the ordinate of each remaining plane coordinate serve as a third abscissa and a third ordinate respectively, that is, the third abscissas and third ordinates of the flying mouse at the plurality of preset positions are obtained.
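The step of retaining points close to the fitted straight line could, for instance, be realized as a point-to-line distance filter. A hedged sketch: the helper name `filter_near_line` and its distance threshold are assumptions for illustration, not part of the disclosure.

```python
import numpy as np

def filter_near_line(points, p1, p2, thresh=2.0):
    """Keep only the points within `thresh` of the line through p1 and p2.

    points: (N, 2) candidate plane coordinates; p1, p2: the two
    coordinate points that define the first fitted straight-line
    function. The threshold value is an assumed parameter.
    """
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d = p2 - p1
    n = np.array([-d[1], d[0]]) / np.linalg.norm(d)  # unit normal to the line
    dist = np.abs((np.asarray(points, float) - p1) @ n)  # perpendicular distances
    return np.asarray(points)[dist < thresh]
```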
Step S133: generating first position coordinates of the flying mouse at the plurality of preset positions based on the plurality of third abscissas, the plurality of third ordinates, and the plurality of first ordinates.
Then, the first position coordinate of the flying mouse at each preset position is generated from the third abscissa, the third ordinate, and the first ordinate of the flying mouse at that preset position. In this way, the first position coordinates of the flying mouse at the plurality of preset positions can be obtained, each comprising an abscissa, an ordinate, and a vertical coordinate. Moreover, the abscissa and the ordinate are obtained by correcting the gyroscope data of the flying mouse against positioning data derived from the images captured by the binocular CMOS camera, so their reliability is high.
Step S140: and generating second position data corresponding to the flying mouse at the preset positions respectively based on the first position coordinates and the attitude angle data.
After the first position coordinates, each comprising an abscissa, an ordinate, and a vertical coordinate, are obtained, the second position data of the flying mouse at each preset position can be generated by combining them with the attitude angle data measured by the gyroscope sensor of the flying mouse, that is, the final six degrees of freedom alpha, beta, gamma, x, y, and z of the flying mouse are obtained. Therefore, the three-dimensional posture data of the flying mouse, namely its six degrees of freedom, can be acquired well even under dark conditions.
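Assembling the six degrees of freedom from the gyroscope attitude angles and the corrected position coordinates is a simple fusion step; a minimal sketch with a hypothetical `Pose6DoF` container (the names are illustrative, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """Hypothetical container for the six degrees of freedom:
    attitude angles (alpha, beta, gamma) measured by the flying
    mouse's gyroscope sensor, and position (x, y, z) taken from
    the corrected first position coordinate."""
    alpha: float
    beta: float
    gamma: float
    x: float
    y: float
    z: float

def fuse(attitude_angles, position):
    """Combine one attitude-angle triple with one position coordinate."""
    alpha, beta, gamma = attitude_angles
    x, y, z = position
    return Pose6DoF(alpha, beta, gamma, x, y, z)
```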
Second embodiment
Referring to fig. 6, the three-dimensional pose positioning apparatus 200 according to a second embodiment of the present invention includes a first data acquisition module 210, a second data acquisition module 220, a processing module 230, and a data generation module 240. The first data acquisition module 210 is configured to acquire a plurality of first position data respectively corresponding to a flying mouse at a plurality of preset positions, measured by a sensor of the flying mouse which is internally provided with an infrared diode, where the first position data include attitude angle data and first plane coordinates; the second data acquisition module 220 is configured to obtain a plurality of first spatial coordinates respectively corresponding to the flying mouse at the plurality of preset positions according to position images of the flying mouse at the plurality of preset positions acquired by a binocular CMOS camera provided with an infrared filter; the processing module 230 is configured to obtain, based on a first preset algorithm, a plurality of first position coordinates corresponding to the plurality of first plane coordinates and the plurality of first spatial coordinates; the data generation module 240 is configured to generate second position data respectively corresponding to the flying mouse at the plurality of preset positions based on the plurality of first position coordinates and the plurality of attitude angle data.
In an embodiment of the present invention, the first plane coordinate includes a first abscissa and a first ordinate, and the first spatial coordinate includes a second abscissa, a second ordinate, and a first ordinate. Referring to fig. 7, the processing module 230 includes a correction function acquisition unit 231, a correction unit 232, and an execution unit 233. The correction function acquisition unit 231 is configured to obtain, based on a preset coordinate correction algorithm, a correction function relating the plurality of first abscissas and first ordinates to the plurality of second abscissas and second ordinates; the correction unit 232 is configured to correct the plurality of first abscissas, first ordinates, second abscissas, and second ordinates based on the correction function to obtain third abscissas and third ordinates of the flying mouse at the plurality of preset positions; the execution unit 233 is configured to generate first position coordinates of the flying mouse at the plurality of preset positions based on the plurality of third abscissas, third ordinates, and first ordinates.
Further, the correction function acquisition unit 231 includes a first correction function acquisition subunit, a second correction function acquisition subunit, a third correction function acquisition subunit, and a fourth correction function acquisition subunit. The first correction function acquisition subunit is configured to obtain, based on a preset feature matching algorithm, a first transformation matrix mapping a first set formed by the plurality of first abscissas and first ordinates to a second set formed by the plurality of second abscissas and second ordinates; the second correction function acquisition subunit is configured to acquire a set of differences between a third set, obtained by mapping the first set through the first transformation matrix, and the second set; the third correction function acquisition subunit is configured to take partial derivatives of the squares of the difference set by the least square method, and obtain a first coordinate point, formed by the first abscissa and the first ordinate, and a second coordinate point, formed by the second abscissa and the second ordinate, corresponding to the difference point with the minimum deviation; the fourth correction function acquisition subunit is configured to generate a first fitted straight-line function as the correction function based on the first coordinate point and the second coordinate point.
In the embodiment of the present invention, referring to fig. 8, the second data acquisition module 220 includes an image acquisition unit 221, an image processing unit 222, a matching processing unit 223, and a coordinate acquisition unit 224. The image acquisition unit 221 is configured to acquire a first image and a second image of the flying mouse at a preset position, respectively acquired by a first CMOS camera and a second CMOS camera of the binocular CMOS camera; the image processing unit 222 is configured to obtain a first reverse projection view corresponding to the first image and a second reverse projection view corresponding to the second image, where the first reverse projection view and the second reverse projection view include regions corresponding to the flying mouse; the matching processing unit 223 is configured to obtain, according to a second preset algorithm, a second transformation matrix corresponding to the coordinate points in the regions corresponding to the flying mouse in the first reverse projection view and the second reverse projection view, and a third coordinate point satisfying the second transformation matrix; the coordinate acquisition unit 224 is configured to obtain a second abscissa, a second ordinate, and a first ordinate of the flying mouse at the preset position based on the third coordinate point, the second transformation matrix, and a preset model.
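The reverse (back) projection views mentioned above can be understood as probability maps: each pixel is replaced by the likelihood of its value under a histogram of the target, here the infrared spot of the flying mouse. A simplified single-channel numpy sketch, where `back_project` and the bin count are illustrative assumptions rather than the patent's implementation:

```python
import numpy as np

def back_project(image, target_hist, bins=32):
    """Single-channel back-projection: map each pixel of an 8-bit
    image to the probability of its intensity under a target histogram.

    target_hist: length-`bins` histogram of the target region
    (e.g. the bright infrared spot); it is normalised internally.
    """
    hist = target_hist / max(target_hist.sum(), 1e-12)  # to probabilities
    # Bin index of each pixel intensity (0..255 mapped onto `bins` bins)
    idx = np.clip((image.astype(float) / 256.0 * bins).astype(int), 0, bins - 1)
    return hist[idx]  # same shape as `image`, values in [0, 1]
```

With an infrared filter on the camera, the target histogram concentrates in the brightest bins, so the back-projection is close to 1 at the spot and near 0 elsewhere.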
Further, the image acquisition unit 221 includes a first image acquisition subunit, a second image acquisition subunit, and a third image acquisition subunit. The first image acquisition subunit is configured to control the first CMOS camera and the second CMOS camera to acquire a third image and a fourth image of the flying mouse; the second image acquisition subunit is configured to obtain a first frame selection image and a second frame selection image corresponding to the third image and the fourth image respectively, where the first frame selection image and the second frame selection image are images of the frame selection areas corresponding to the flying mouse; the third image acquisition subunit is configured to, when the images of the flying mouse at the preset position, respectively acquired by the first CMOS camera and the second CMOS camera, match the first frame selection image and the second frame selection image, take the image acquired by the first CMOS camera as the first image and the image acquired by the second CMOS camera as the second image.
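The patent does not specify how a newly acquired image is matched against the stored frame selection image; one plausible criterion is normalized cross-correlation, sketched below with an assumed threshold. The names `match_score` and `is_match` and the 0.9 threshold are illustrative.

```python
import numpy as np

def match_score(patch, template):
    """Normalized cross-correlation between an image patch and the
    stored frame selection template; scores near 1 indicate a match."""
    a = patch.astype(float) - patch.mean()
    b = template.astype(float) - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def is_match(patch, template, thresh=0.9):
    """Assumed decision rule: accept the patch when the score is high."""
    return match_score(patch, template) >= thresh
```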
Third embodiment
A third embodiment of the present invention provides an electronic device 100. Referring to fig. 1, the electronic device 100 includes a memory 102 and a processor 106, the memory 102 being coupled to the processor 106. The memory 102 stores instructions that, when executed by the processor 106, cause the processor 106 to: acquire a plurality of first position data respectively corresponding to a flying mouse at a plurality of preset positions, measured by a sensor of the flying mouse which is internally provided with an infrared diode, where the first position data include attitude angle data and first plane coordinates; obtain a plurality of first space coordinates respectively corresponding to the flying mouse at the plurality of preset positions according to position images of the flying mouse at the plurality of preset positions acquired by a binocular CMOS camera provided with an infrared filter; obtain, based on a first preset algorithm, a plurality of first position coordinates corresponding to the plurality of first plane coordinates and the plurality of first space coordinates; and generate second position data respectively corresponding to the flying mouse at the plurality of preset positions based on the plurality of first position coordinates and the plurality of attitude angle data.
In summary, according to the three-dimensional attitude positioning method, the three-dimensional attitude positioning device, and the electronic device provided in the embodiments of the present invention, a plurality of first position data respectively corresponding to a flying mouse at a plurality of preset positions, measured by a sensor of the flying mouse which is internally provided with an infrared diode, are acquired, where the first position data include attitude angle data and first plane coordinates; then, a plurality of first space coordinates respectively corresponding to the flying mouse at the plurality of preset positions are obtained according to position images of the flying mouse at the plurality of preset positions acquired by a binocular CMOS camera provided with an infrared filter; a plurality of first position coordinates corresponding to the plurality of first plane coordinates and the plurality of first space coordinates are obtained based on a first preset algorithm; and finally, second position data respectively corresponding to the flying mouse at the plurality of preset positions are generated based on the plurality of first position coordinates and the plurality of attitude angle data. Therefore, the three-dimensional attitude data of the flying mouse can be obtained under dark conditions, which solves the problem that the three-dimensional attitude positioning data of the flying mouse cannot be obtained in the dark.
It should be noted that, in the present specification, the embodiments are all described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other. For the device-like embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present invention may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes. It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. A method of three-dimensional pose localization, the method comprising:
acquiring a plurality of first position data which are respectively corresponding to a plurality of preset positions of a flying mouse and are measured by a sensor of the flying mouse internally provided with an infrared diode, wherein the first position data comprise attitude angle data and a first plane coordinate;
obtaining a plurality of first space coordinates respectively corresponding to the flying mouse at the plurality of preset positions according to position images of the flying mouse at the plurality of preset positions, which are acquired by a binocular CMOS camera provided with an infrared filter;
obtaining, based on a first preset algorithm, a plurality of first position coordinates corresponding to the plurality of first plane coordinates and the plurality of first space coordinates;
generating second position data corresponding to the flying mouse at the plurality of preset positions respectively based on the plurality of first position coordinates and the plurality of attitude angle data;
wherein the obtaining, according to the position images of the flying mouse at the plurality of preset positions acquired by the binocular CMOS camera provided with an infrared filter, of the plurality of first space coordinates respectively corresponding to the flying mouse at the plurality of preset positions comprises:
acquiring a first image and a second image of the flying mouse at a preset position, wherein the first image and the second image are respectively acquired by a first CMOS camera and a second CMOS camera of the binocular CMOS camera;
acquiring a first reverse projection drawing corresponding to the first image and a second reverse projection drawing corresponding to the second image, wherein the first reverse projection drawing and the second reverse projection drawing comprise areas corresponding to the flying mouse;
acquiring a second transformation matrix corresponding to coordinate points in a region corresponding to the flying mouse in the first reverse projection drawing and the second reverse projection drawing and a third coordinate point meeting the second transformation matrix according to a RANSAC matching algorithm;
acquiring a second abscissa, a second ordinate and a first ordinate of the flying mouse at the preset position based on the third coordinate point, the second transformation matrix and a preset model;
repeating the steps from the acquiring of the first image and the second image of the flying mouse at one preset position, respectively acquired by the first CMOS camera and the second CMOS camera of the binocular CMOS camera, to the acquiring of the second abscissa, the second ordinate and the first ordinate of the flying mouse at the preset position based on the third coordinate point, the second transformation matrix and the preset model, so as to obtain the first space coordinates constituted by the second abscissas, the second ordinates and the first ordinates of the flying mouse at the plurality of preset positions.
2. The method according to claim 1, wherein the first plane coordinates comprise a first abscissa and a first ordinate, the first space coordinates comprise a second abscissa, a second ordinate, and a first ordinate, and the obtaining, based on a first preset algorithm, of a plurality of first position coordinates corresponding to the plurality of first plane coordinates and the plurality of first space coordinates comprises:
obtaining a correction function of the plurality of first abscissas and the plurality of first ordinates, and the plurality of second abscissas and the plurality of second ordinates based on a preset coordinate correction algorithm;
correcting the plurality of first abscissas, the plurality of first ordinates, the plurality of second abscissas and the plurality of second ordinates based on the correction function to obtain third abscissas and third ordinates of the flying mouse at the plurality of preset positions;
generating first position coordinates of the flying mouse at a plurality of preset positions based on the plurality of third abscissa coordinates, the plurality of third ordinate coordinates, and the plurality of first ordinate coordinates.
3. The method according to claim 2, wherein said obtaining a correction function based on a preset coordinate correction algorithm for said plurality of said first abscissas and said plurality of said first ordinates, and for said plurality of said second abscissas and said plurality of said second ordinates comprises:
obtaining, based on a preset feature matching algorithm, a first transformation matrix mapping a first set formed by the plurality of first abscissas and the plurality of first ordinates to a second set formed by the plurality of second abscissas and the plurality of second ordinates;
acquiring a set of differences between a third set, obtained by mapping the first set through the first transformation matrix, and the second set;
taking partial derivatives of the squares of the difference set by the least square method, and obtaining a first coordinate point, formed by the first abscissa and the first ordinate, and a second coordinate point, formed by the second abscissa and the second ordinate, corresponding to the difference point with the minimum deviation;
generating a first fitted straight-line function as the correction function based on the first coordinate point and the second coordinate point.
4. The method of claim 1, wherein the acquiring of the first image and the second image of the flying mouse at the preset position, respectively acquired by the first CMOS camera and the second CMOS camera of the binocular CMOS camera, comprises:
controlling the first CMOS camera and the second CMOS camera to acquire a third image and a fourth image of the flying mouse;
acquiring a first frame selection image and a second frame selection image which respectively correspond to the third image and the fourth image, wherein the first frame selection image and the second frame selection image are images corresponding to frame selection areas corresponding to the flying mouse;
and when the images of the flying mouse at the preset position, which are respectively acquired by the first CMOS camera and the second CMOS camera, are respectively matched with the first frame selection image and the second frame selection image, the image acquired by the first CMOS camera is used as a first image, and the image acquired by the second CMOS camera is used as a second image.
5. A three-dimensional attitude determination apparatus, characterized in that the apparatus comprises: a first data acquisition module, a second data acquisition module, a processing module, and a data generation module, wherein,
the first data acquisition module is used for acquiring a plurality of first position data which are respectively corresponding to a flying mouse at a plurality of preset positions and are measured by a sensor of the flying mouse internally provided with an infrared diode, and the first position data comprise attitude angle data and first plane coordinates;
the second data acquisition module is used for acquiring a plurality of first space coordinates corresponding to the flying mouse at the plurality of preset positions respectively according to position images of the flying mouse at the plurality of preset positions, which are acquired by a binocular CMOS camera provided with an infrared filter;
the processing module is used for obtaining a plurality of first plane coordinates and a plurality of first position coordinates corresponding to the first space coordinates based on a first preset algorithm;
the data generation module is used for generating second position data corresponding to the flying mouse at the plurality of preset positions respectively based on the plurality of first position coordinates and the plurality of attitude angle data;
the second data acquisition module comprises an image acquisition unit, an image processing unit, a matching processing unit and a coordinate acquisition unit, wherein the image acquisition unit is used for acquiring a first image and a second image of the flying mouse at a preset position, which are acquired by a first CMOS camera and a second CMOS camera of the binocular CMOS camera respectively; the image processing unit is used for acquiring a first reverse projection drawing corresponding to the first image and a second reverse projection drawing corresponding to the second image, and the first reverse projection drawing and the second reverse projection drawing comprise areas corresponding to the flying mouse; the matching processing unit is used for acquiring a second transformation matrix corresponding to a coordinate point in a region corresponding to the flying mouse in the first reverse projection drawing and the second reverse projection drawing and a third coordinate point meeting the second transformation matrix according to a RANSAC matching algorithm; the coordinate obtaining unit is used for obtaining a second abscissa, a second ordinate and a first ordinate of the flying mouse at the preset position based on the third coordinate point, the second transformation matrix and a preset model.
6. The apparatus of claim 5, wherein the first planar coordinate comprises a first abscissa and a first ordinate, wherein the first spatial coordinate comprises a second abscissa, a second ordinate, and a first ordinate, and wherein the processing module comprises a correction function acquisition unit, a correction unit, and an execution unit, wherein,
the correction function acquisition unit is used for acquiring correction functions of the plurality of first abscissas, the plurality of first ordinates, the plurality of second abscissas and the plurality of second ordinates based on a preset coordinate correction algorithm;
the correction unit is used for correcting the plurality of first abscissas, the plurality of first ordinates, the plurality of second abscissas and the plurality of second ordinates based on the correction function to obtain third abscissas and third ordinates of the flying mouse at a plurality of preset positions;
the execution unit is configured to generate first position coordinates of the flying mouse at the plurality of preset positions based on the plurality of third abscissas, the plurality of third ordinates, and the plurality of first ordinates.
7. The apparatus according to claim 6, wherein the correction function acquisition unit includes a first correction function acquisition sub-unit, a second correction function acquisition sub-unit, a third correction function acquisition sub-unit, and a fourth correction function acquisition sub-unit, wherein,
the first correction function obtaining subunit is configured to obtain, based on a preset feature matching algorithm, a first transformation matrix in which a first set formed by the plurality of first abscissa coordinates and the plurality of first ordinate coordinates is mapped to a second set formed by the plurality of second abscissa coordinates and the plurality of second ordinate coordinates;
the second correction function acquisition subunit is configured to acquire a set of differences between a third set, obtained by mapping the first set through the first transformation matrix, and the second set;
the third correction function acquisition subunit is configured to take partial derivatives of the squares of the difference set by the least square method, and obtain a first coordinate point, formed by the first abscissa and the first ordinate, and a second coordinate point, formed by the second abscissa and the second ordinate, corresponding to the difference point with the minimum deviation;
the fourth correction function acquisition subunit is configured to generate a first fitted straight-line function as the correction function based on the first coordinate point and the second coordinate point.
8. An electronic device comprising a memory and a processor, the memory coupled to the processor, the memory storing instructions that, when executed by the processor, cause the processor to:
acquiring a plurality of first position data which are respectively corresponding to a plurality of preset positions of a flying mouse and are measured by a sensor of the flying mouse internally provided with an infrared diode, wherein the first position data comprise attitude angle data and a first plane coordinate;
obtaining a plurality of first space coordinates respectively corresponding to the flying mouse at the plurality of preset positions according to position images of the flying mouse at the plurality of preset positions, which are acquired by a binocular CMOS camera provided with an infrared filter;
obtaining, based on a first preset algorithm, a plurality of first position coordinates corresponding to the plurality of first plane coordinates and the plurality of first space coordinates;
generating second position data corresponding to the flying mouse at the plurality of preset positions respectively based on the plurality of first position coordinates and the plurality of attitude angle data;
wherein the obtaining, according to the position images of the flying mouse at the plurality of preset positions acquired by the binocular CMOS camera provided with an infrared filter, of the plurality of first space coordinates respectively corresponding to the flying mouse at the plurality of preset positions comprises:
acquiring a first image and a second image of the flying mouse at a preset position, wherein the first image and the second image are respectively acquired by a first CMOS camera and a second CMOS camera of the binocular CMOS camera;
acquiring a first back-projection image corresponding to the first image and a second back-projection image corresponding to the second image, wherein the first back-projection image and the second back-projection image each comprise a region corresponding to the flying mouse;
acquiring, according to a RANSAC matching algorithm, a second transformation matrix relating the coordinate points in the regions corresponding to the flying mouse in the first back-projection image and the second back-projection image, and third coordinate points meeting the second transformation matrix;
acquiring a second abscissa, a second ordinate and a first ordinate of the flying mouse at the preset position based on the third coordinate points, the second transformation matrix and a preset model;
repeating the steps from acquiring the first image and the second image of the flying mouse at one preset position, acquired respectively by the first CMOS camera and the second CMOS camera of the binocular CMOS camera, through acquiring the second abscissa, the second ordinate and the first ordinate of the flying mouse at that preset position based on the third coordinate points, the second transformation matrix and the preset model, so as to obtain the plurality of first space coordinates, each composed of the second abscissa, the second ordinate and the first ordinate of the flying mouse at one of the plurality of preset positions.
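The back-projection step can be illustrated with a minimal sketch. This is not the patent's implementation: it only assumes that, through the infrared filter, the flying mouse's IR diode appears as a bright blob, builds an intensity histogram from a reference patch of the diode, and back-projects that histogram to score every pixel of a camera frame. The function names, bin count, and threshold are illustrative.

```python
# Illustrative back-projection: score each pixel of a camera image by how well
# its intensity matches a reference histogram of the IR diode, then keep the
# high-scoring coordinates as the region corresponding to the flying mouse.

def intensity_histogram(patch, bins=16):
    """Normalized intensity histogram of a reference patch (values 0..255)."""
    hist = [0.0] * bins
    count = 0
    for row in patch:
        for v in row:
            hist[v * bins // 256] += 1.0
            count += 1
    return [h / count for h in hist]

def back_project(image, hist, bins=16):
    """Replace every pixel by the histogram probability of its intensity."""
    return [[hist[v * bins // 256] for v in row] for row in image]

def region_of_interest(back_projection, threshold):
    """Coordinates whose back-projection score exceeds the threshold."""
    return [(x, y)
            for y, row in enumerate(back_projection)
            for x, score in enumerate(row)
            if score > threshold]

# Toy frame: a bright 2x2 blob (the diode) on a dark background.
image = [[10, 10, 10, 10],
         [10, 250, 250, 10],
         [10, 250, 250, 10],
         [10, 10, 10, 10]]
reference_patch = [[250, 250], [250, 250]]  # assumed appearance of the diode
hist = intensity_histogram(reference_patch)
region = region_of_interest(back_project(image, hist), 0.5)
print(region)  # [(1, 1), (2, 1), (1, 2), (2, 2)]
```

Running the same scoring on both camera frames yields the two regions that the subsequent matching step compares.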
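The RANSAC matching step can likewise be sketched. The patent does not disclose the form of the transformation; for brevity this illustration fits a pure 2D translation between tentatively matched coordinate points and keeps the matches consistent with it, standing in for the transformation matrix and the coordinate points that satisfy it. The sample-fit-score-keep-best loop is the part that carries over to richer models.

```python
import random

def ransac_translation(pts_a, pts_b, iterations=100, tol=2.0, seed=0):
    """pts_a[i], pts_b[i] are tentative matches; return best (dx, dy) and inlier indices."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iterations):
        i = rng.randrange(len(pts_a))          # minimal sample: one match fixes a translation
        dx = pts_b[i][0] - pts_a[i][0]
        dy = pts_b[i][1] - pts_a[i][1]
        inliers = [j for j, (a, b) in enumerate(zip(pts_a, pts_b))
                   if abs(b[0] - a[0] - dx) <= tol and abs(b[1] - a[1] - dy) <= tol]
        if len(inliers) > len(best_inliers):   # keep the model with the most support
            best_model, best_inliers = (dx, dy), inliers
    return best_model, best_inliers

# Four matches shifted by (5, 0) plus one gross mismatch.
pts_a = [(0, 0), (1, 0), (2, 1), (3, 1), (4, 2)]
pts_b = [(5, 0), (6, 0), (7, 1), (8, 1), (40, 30)]
model, inliers = ransac_translation(pts_a, pts_b)
print(model, inliers)  # (5, 0) [0, 1, 2, 3] -- the mismatch is rejected
```

The surviving inlier points are what the claim language above calls the coordinate points meeting the transformation.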
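The patent leaves the preset model unspecified; a common choice for a calibrated, rectified binocular rig is disparity-based triangulation, sketched here. The focal length, baseline, and principal point are assumed calibration constants, and the returned triple plays the role of the second abscissa, second ordinate, and first ordinate that make up one first space coordinate.

```python
# Illustrative triangulation for a rectified stereo pair: a point seen at column
# u_l in the left image and u_r in the right image (same row v) has depth
# z = f * B / (u_l - u_r); x and y follow by back-projecting the left pixel.

def triangulate(u_l, u_r, v, f=800.0, baseline=0.12, cx=320.0, cy=240.0):
    """Return the space coordinate (x, y, z) of one matched point pair (metres)."""
    disparity = u_l - u_r
    if disparity <= 0:
        raise ValueError("matched point must lie in front of both cameras")
    z = f * baseline / disparity   # depth from disparity
    x = (u_l - cx) * z / f         # horizontal offset from the optical axis
    y = (v - cy) * z / f           # vertical offset from the optical axis
    return x, y, z

x, y, z = triangulate(u_l=400.0, u_r=384.0, v=260.0)
print(round(x, 3), round(y, 3), round(z, 3))  # 0.6 0.15 6.0
```

Averaging such per-point coordinates over the inlier set would give one space coordinate per preset position, consistent with the repetition described in the last step above.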
CN201710961392.1A 2017-10-17 2017-10-17 Three-dimensional attitude positioning method and device and electronic equipment Active CN107608541B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710961392.1A CN107608541B (en) 2017-10-17 2017-10-17 Three-dimensional attitude positioning method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN107608541A CN107608541A (en) 2018-01-19
CN107608541B true CN107608541B (en) 2021-03-05

Family

ID=61077410

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710961392.1A Active CN107608541B (en) 2017-10-17 2017-10-17 Three-dimensional attitude positioning method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN107608541B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112017238B (en) * 2019-05-30 2024-07-19 北京初速度科技有限公司 Method and device for determining spatial position information of linear object
CN111031558A (en) * 2019-12-23 2020-04-17 安徽理工大学 Mobile traffic prediction method and device, computer equipment and storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
CN101238428A (en) * 2005-08-22 2008-08-06 叶勤中 Free-space pointing and handwriting
CN107102749A (en) * 2017-04-23 2017-08-29 吉林大学 A kind of three-dimensional pen type localization method based on ultrasonic wave and inertial sensor

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
CN103443746B (en) * 2010-12-22 2017-04-19 Z空间股份有限公司 Three-dimensional tracking of a user control device in a volume
CN103337066B (en) * 2013-05-27 2016-05-18 清华大学 3D obtains the calibration steps of system
CN104007846A (en) * 2014-05-22 2014-08-27 深圳市宇恒互动科技开发有限公司 Three-dimensional figure generating method and electronic whiteboard system
CN106152937B (en) * 2015-03-31 2019-10-25 深圳超多维科技有限公司 Space positioning apparatus, system and method
CN105222772B (en) * 2015-09-17 2018-03-16 泉州装备制造研究所 A kind of high-precision motion track detection system based on Multi-source Information Fusion
CN106204605B (en) * 2016-07-15 2019-02-19 上海乐相科技有限公司 A kind of localization method and device
CN106372552B (en) * 2016-08-29 2019-03-26 北京理工大学 Human body target recognition positioning method
CN106525003B (en) * 2016-12-16 2019-02-12 深圳市未来感知科技有限公司 A kind of attitude measurement method based on binocular vision

Non-Patent Citations (1)

Title
Zhang Dongmang, Deng Zhongping, Zhao Baolong, Chang Lin. "A Data Correction Method Applied to Air Mice." Radio and Television Information, 2013, pp. 54-57. *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221124

Address after: 212310 Workshop 7 #, Dezi Industrial Park, south of Liyao Road, Danyang Development Zone, Zhenjiang City, Jiangsu Province

Patentee after: Jiangsu shiruidi photoelectric Co.,Ltd.

Patentee after: NINGBO THREDIM OPTOELECTRONICS Co.,Ltd.

Address before: 315000 No.58, Jingu Middle Road (West), Yinzhou District, Ningbo City, Zhejiang Province

Patentee before: NINGBO THREDIM OPTOELECTRONICS Co.,Ltd.
