
CN108021241A - Method for realizing virtual-real fusion of AR glasses - Google Patents

Method for realizing virtual-real fusion of AR glasses Download PDF

Info

Publication number
CN108021241A
CN108021241A
Authority
CN
China
Prior art keywords
glasses
axis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711251652.2A
Other languages
Chinese (zh)
Other versions
CN108021241B (en)
Inventor
谢辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Xiaolong Technology Co Ltd
Original Assignee
Xi'an Xiaolong Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Xiaolong Technology Co Ltd filed Critical Xi'an Xiaolong Technology Co Ltd
Priority to CN201711251652.2A priority Critical patent/CN108021241B/en
Publication of CN108021241A publication Critical patent/CN108021241A/en
Application granted granted Critical
Publication of CN108021241B publication Critical patent/CN108021241B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention belongs to the field of virtual reality and augmented reality, and in particular relates to a method for realizing virtual-real fusion for AR glasses. The method comprises: obtaining the displacement and rotation angle of the glasses in three-dimensional space; calculating from them the pixels by which the virtual object must be moved on the display screen, the angle through which it must be rotated, and its magnification ratio; and adjusting the pose of the virtual object on the display screen in real time according to these quantities, thereby achieving the effect of virtual-real fusion. The displacement and rotation angle are obtained by measuring the acceleration and angular acceleration produced by the glasses in three-dimensional space and integrating them. Through derivation of the imaging light path and extensive experiments, the invention obtains the method and formulas for calculating the pixels to be moved, the angle to be rotated, and the magnification ratio of the virtual object on the display screen, providing a new line of thought and research direction for virtual-real fusion technology.

Description

Method for realizing virtual-real fusion of AR glasses
Technical field
The invention belongs to the field of virtual reality and augmented reality, and in particular relates to a method for realizing virtual-real fusion for AR glasses.
Background technology
With the development of science and technology, AR and VR are increasingly favored. VR, i.e. virtual reality, uses computer equipment to simulate a virtual world, covering a person's vision, hearing, touch and other senses: everything the VR user sees and hears is simulated by the computer, deceiving the sensory organs and giving an immersive feeling. AR, by contrast, superimposes virtual information onto the real world, reaching a sensory experience that goes beyond reality: part of the scene the AR user sees is real and part is virtual, giving a feeling of surpassing reality. Both technologies build a virtual scene for the user through related hardware and imagery, let the user dwell in it, and allow the user to interact with the virtual scene through gestures, voice, body movement and other means.
Realizing virtual-real fusion with AR technology can be widely applied in fields such as gaming and entertainment. In the prior art, when the user's head moves, the virtual object in the head-mounted display cannot make a corresponding state change, so the virtual object moves relative to real objects and the fusion in which the virtual object follows the real object cannot be achieved; the user's experience therefore degrades. To realize the seamless fusion of virtual-world information and real-world information, the image position of the virtual object must be adjusted in real time according to the device's own position. Since the imaging systems of most AR glasses use display technologies such as OLED, LCOS or LCD, the position and posture of the virtual object on the display screen must be adjusted in real time, ensuring that the relative position between the virtual object presented in the AR glasses and the real object remains fixed, thus realizing the effect of virtual-real fusion.
Summary of the invention
In order that the virtual object makes a corresponding adjustment as the head-mounted device moves, thereby realizing the effect of virtual-real fusion, the present invention provides a method for realizing virtual-real fusion for AR glasses, implemented through the following technical scheme:
A method for realizing virtual-real fusion for AR glasses, which adjusts the pose of the virtual object image on the AR glasses' display screen in real time according to changes in the pose of the AR glasses, comprising:
obtaining the displacement and rotation angle of the AR glasses in three-dimensional space; calculating, from the obtained displacement and rotation angle, the pixels by which the virtual object must be moved on the display screen, the angle through which it must be rotated, and its magnification ratio; and adjusting the pose of the virtual object image on the AR glasses' display screen in real time according to the calculated values.
Obtaining the displacement and rotation angle of the AR glasses in three-dimensional space specifically comprises: obtaining the acceleration and angular acceleration values produced by the AR glasses in three-dimensional space, and integrating the acceleration and angular acceleration to obtain the displacement and rotation angle of the AR glasses in three-dimensional space.
Before obtaining the acceleration and angular acceleration values produced by the AR glasses in three-dimensional space, the method further comprises establishing a first coordinate system to describe the movement of the AR glasses in three-dimensional space, and a second coordinate system to describe the position of the virtual object on the display screen, specifically:
taking the line connecting the two eyeballs as the x-axis and the pointing direction of the viewfinder camera axis in front of the AR glasses as the y-axis to construct the xoy plane, and taking the direction perpendicular to the xoy plane as the z-axis, establishing the first coordinate system xoyz; taking the plane of the AR glasses' display screen as the σoρ plane and the direction perpendicular to it as the η axis, establishing the second coordinate system σoρη.
Obtaining the acceleration and angular acceleration values produced by the AR glasses in three-dimensional space and integrating them to obtain the displacement and rotation angle of the AR glasses in three-dimensional space specifically comprises:
using an inertial sensor and a gyroscope to obtain the acceleration and angular acceleration values of the AR glasses along the three axes of the first coordinate system, and integrating the acceleration and angular acceleration to obtain the displacements S_x, S_y, S_z of the AR glasses along the three axes of the first coordinate system and the rotation angles α_x, α_y, α_z of the AR glasses about the three axes.
Calculating the pixels to be moved, the angle to be rotated and the magnification ratio of the virtual object on the display screen from the displacement and rotation angle specifically comprises the following. When the displacements of the AR glasses along the three axes of the first coordinate system are S_x, S_y, S_z and the rotation angles of the AR glasses about the three axes are α_x, α_y, α_z:
The pixels by which the virtual object must be moved along the ρ axis of the display screen are c_ρ = c_sx + c_az, where
$$c_{sx} = -\frac{d \cdot S_x \cdot m_1}{h \cdot \beta \cdot m}, \qquad c_{az} = \frac{d \cdot \left[\tan(\lambda_1 + \alpha_z) - \tan\lambda_1\right] \cdot m_1}{\beta \cdot m}.$$
The pixels by which the virtual object must be moved along the σ axis of the display screen are c_σ = c_sz + c_ax, where
$$c_{sz} = -\frac{d \cdot S_z \cdot n_1}{h \cdot \beta \cdot n}, \qquad c_{ax} = -\frac{d \cdot \left[\tan(\lambda_2 + \alpha_x) - \tan\lambda_2\right] \cdot n_1}{\beta \cdot n}.$$
The angle through which the virtual object must be rotated about the display screen's σ axis is θ_σ; the angle about the ρ axis is θ_ρ; the angle about the η axis is θ_η = α_y.
The magnification ratio of the virtual object is ∈, where
$$\in \,=\, \frac{h}{h - S_y} \cdot \frac{1}{1 - \frac{1}{\cos\left(d \cdot \tan\lambda_1 - \frac{d \cdot S_x}{h}\right)} + \frac{1}{\cos\lambda_1}} \cdot \frac{1}{1 - \frac{1}{\cos\left(d \cdot \tan\lambda_2 - \frac{d \cdot S_z}{h}\right)} + \frac{1}{\cos\lambda_2}},$$
where h is the imaging distance of the AR optical system, d is the distance from the eye's retina to the AR glasses' lens, λ_1 is the angle between the imaging ray from the human eye to the virtual object and the yoz plane, λ_2 is the angle between that imaging ray and the xoy plane, β is the magnification of the AR glasses' optical system from the display screen to the optical lens, m is the length of the display screen along the ρ axis and m_1 its resolution along that axis, and n is the length of the display screen along the σ axis and n_1 its resolution along that axis.
Compared with the prior art, the above technical scheme has the following technical effects:
By studying how the virtual object moves on the display screen of the AR glasses as the user moves in three-dimensional space, the invention analyzes the law by which the imaging of the virtual object and its position relative to real objects change with the movement of the AR glasses. By analyzing the light rays between the human eye and real objects and the propagation light path of the virtual object in the head-mounted display, it maps movement and rotation of the AR glasses in real three-dimensional space to movement of the virtual object on the display screen and rotation about the axes of three-dimensional space. It then obtains the displacement and rotation angle of the glasses in three-dimensional space and calculates from them the pixels to be moved, the angle to be rotated and the magnification ratio of the virtual object on the display screen, so as to adjust the pose of the virtual object on the display screen in real time. The relative position between the virtual object presented in the AR glasses and the real object thus remains fixed, realizing the effect of virtual-real fusion to the greatest extent.
Brief description of the drawings
Fig. 1 is a flow chart of the method of the present invention;
Fig. 2 is a schematic diagram of the first coordinate system established by the present invention;
Fig. 3 is a schematic diagram of the second coordinate system and the virtual-image principle.
Meaning of the reference numerals in the figures: 1 - first coordinate system; 2 - second coordinate system; 3 - display screen; 4 - human eye; 5 - half-reflecting, half-transmitting lens; 6 - image of the virtual object as seen by the viewer; 7 - AR glasses.
Detailed description of the embodiments
In the method for realizing virtual-real fusion for AR glasses provided by the invention, the construction of the first and second coordinate systems can be adapted to the relative position of the display screen and the optical module and is not limited to the method disclosed here. Specifically, the line connecting the two eyeballs is taken as the x-axis and the pointing direction of the viewfinder camera axis in front of the AR glasses as the y-axis to construct the xoy plane; the direction perpendicular to the xoy plane is taken as the z-axis, establishing the first coordinate system xoyz. The plane of the display screen in the AR glasses is then taken as one plane and the direction perpendicular to it as the remaining axis, establishing the second coordinate system σoρη.
Through optical-imaging reasoning and extensive experimental verification by the inventor, it was found that when a user wearing AR glasses moves in three-dimensional space, virtual-real fusion can be achieved by adjusting the pixels moved, the magnification factor and the three-dimensional rotation of the virtual object in the plane of the display screen. With the method provided by the invention, from the displacements S_x, S_y, S_z of the AR glasses along the three axes of the first coordinate system and the rotation angles α_x, α_y, α_z of the glasses about the three axes, a technician can obtain the number of pixels c_ρ by which the virtual object moves along the ρ axis of the display screen and the number c_σ by which it moves along the σ axis. Optical analysis and extensive experiments further show that when the virtual object moves in the first coordinate system (decomposable into movement along two coordinate axes), the virtual object correspondingly moves in the plane of the display screen (likewise decomposable into movement along two coordinate axes). After the positive directions of the axes of the first coordinate system are fixed, the direction in which the image moves on the display screen when the virtual object moves in the positive direction in the first coordinate system is defined as the positive direction of the corresponding axis of the second coordinate system. Once the positive directions of two axes of a three-dimensional coordinate system are fixed, the positive direction of the third axis is determined by the right-hand rule; this fixes the positive direction of the third axis of the second coordinate system.
In addition, after the positive directions of the first coordinate system are fixed, the displacements S_x, S_y, S_z of the AR glasses along its three axes are represented as vectors: movement along the positive direction of an axis is taken as positive. The rotations α_x, α_y, α_z about the three axes are also represented as vectors: with the right hand gripping the coordinate axis and the thumb pointing in the axis's positive direction, the direction in which the four fingers curl is the positive direction of the rotation angle. The same convention applies in the second coordinate system. The numbers of pixels by which the virtual object moves along the ρ and σ axes of the display screen, obtained by calculation, are also vectors: a positive pixel count moves the virtual object along the positive direction of the axis, and a negative count along the negative direction. When adjusting the rotation of the virtual object in the second coordinate system, the adjustment is likewise made according to the vector-direction convention.
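The right-hand-rule construction of the third axis described above can be checked numerically: once the positive unit vectors of two axes are fixed, the positive direction of the third axis is their cross product. A minimal sketch (the function name and example vectors are illustrative, not from the patent):

```python
def cross(u, v):
    """Cross product u x v; by the right-hand rule, the result points
    along the positive direction of the third axis when u and v are
    the positive unit vectors of the first two axes."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

# If sigma and rho take the roles of the first two axes of the second
# coordinate system, the eta axis points along sigma x rho.
sigma_hat = (1.0, 0.0, 0.0)
rho_hat = (0.0, 1.0, 0.0)
eta_hat = cross(sigma_hat, rho_hat)
print(eta_hat)  # (0.0, 0.0, 1.0)
```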
When a user of the invention wears the AR glasses, each movement is small and must be tracked in real time and continuously. The acceleration and angular acceleration values produced by the glasses in three-dimensional space are obtained, and the acceleration and angular acceleration are integrated to obtain the displacement and rotation angle of the glasses in three-dimensional space relative to their previous position. To reduce the error of the integral calculation, Δt should be as small as possible.
The invention may first define the first coordinate system at the start of each interval Δt, use the displacement and angle accumulated over that Δt to calculate and apply the movement, rotation and magnification of the virtual object on the display screen, and then compute the adjustment of the virtual object on the display screen for the next Δt in the same way. Within the limits of the computer's processing speed, Δt should be as small as possible, to obtain a continuous, real-time adjustment effect.
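The requirement that Δt be as small as the processor allows can be illustrated with a quick numerical check: forward-Euler double integration of a constant acceleration overshoots the exact displacement at²/2, and the overshoot shrinks roughly linearly with Δt. A sketch under assumed test values (not from the patent):

```python
def integration_error(dt, t=1.0, a=2.0):
    """Double-integrate a constant acceleration a over time t with step
    dt and return the absolute error against the exact result a*t*t/2."""
    v = s = 0.0
    for _ in range(int(round(t / dt))):
        v += a * dt   # velocity update
        s += v * dt   # displacement update
    return abs(s - 0.5 * a * t * t)

# Halving the step roughly halves the integration error.
print(integration_error(0.01), integration_error(0.001))
```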
Since the display field of view of the AR glasses is limited, as the head moves, the adjustment of the virtual object may carry it beyond the edge of the display screen. This does not affect the functioning of the invention: when the virtual object is beyond the edge of the display screen, the pixel shifts continue to be accumulated until the movement brings the virtual object back onto the display screen.
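The off-screen behaviour described here, namely continuing to accumulate pixel shifts while the object is outside the display so that it reappears at the correct position, can be sketched as follows (the 1024×768 resolution matches the embodiment; the anchor-pixel bookkeeping is an illustrative assumption):

```python
def track_anchor(shifts, start, res=(1024, 768)):
    """Apply per-interval (d_rho, d_sigma) pixel shifts to the virtual
    object's anchor pixel and report after each step whether the anchor
    is currently on the display. Shifts keep being applied even while
    the object is off-screen."""
    rho, sigma = start
    states = []
    for d_rho, d_sigma in shifts:
        rho += d_rho
        sigma += d_sigma
        on_screen = 0 <= rho < res[0] and 0 <= sigma < res[1]
        states.append(((rho, sigma), on_screen))
    return states

# The object drifts off the rho edge and then comes back into view.
states = track_anchor([(600, 0), (600, 0), (-700, 0)], start=(512, 384))
print([on for _, on in states])  # [False, False, True]
```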
The method of the present invention is not limited to computing the movement distance and rotation angle by integrating acceleration or angular acceleration; obtaining these two values by other means in order to realize the virtual-real fusion effect also falls within the scope of the invention. Acceleration and angular acceleration values are only one kind of input with which the invention realizes virtual-real fusion; there are many ways of obtaining the movement distance and rotation angle, for example by means of a depth camera or an external positioning device.
To make the purpose, technical scheme and advantages of the embodiments of the present invention clearer, the invention is described in further detail below with reference to the accompanying drawings and embodiments.
Embodiment 1
Fig. 1 is a flow chart of the method of the present invention; the invention is realized according to the following method, with reference to Figs. 2 and 3. The line connecting the two eyeballs is taken as the x-axis and the pointing direction of the viewfinder camera axis in front of the AR glasses as the y-axis, constructing the xoy plane; the direction perpendicular to the xoy plane is taken as the z-axis, establishing the first coordinate system xoyz. The plane of the display screen is taken as the σoρ plane and the direction perpendicular to it as the η axis, establishing the second coordinate system σoρη. The positive directions of the first coordinate system are specified; for rotation angles, the right hand grips the coordinate axis with the thumb pointing in the axis's positive direction, and the curl of the four fingers is the positive direction. This convention applies equally to the second coordinate system. For the reader's convenience, the analysis of this embodiment places the second coordinate system at position 6, where the viewer sees the image of the virtual object, as in Fig. 3. In this embodiment the displacements and rotations of the glasses along each axis of the first coordinate system are all represented as vectors: a displacement along the positive direction of an axis is positive and along the negative direction negative, and a rotation in the specified positive direction is positive, otherwise negative.
When the user wearing the AR glasses moves, an inertial sensor and a gyroscope are used to obtain the acceleration and angular acceleration values of the glasses along the three axes of the first coordinate system. The travel time of the glasses is t; the whole movement track is divided into k segments, where the velocity of the i-th segment is v_i and its acceleration is a_i, each segment lasting Δt. As Δt → 0, taking the x-axis as an example, the displacement of the glasses along the x-axis over the period t is S_x = Σ_{i=1}^{k} v_i·Δt; the displacements S_y and S_z along the y and z axes follow similarly. The angular accelerations output by the gyroscope on the three axes are ξ_x, ξ_y, ξ_z; taking the x-axis as an example, integration of ξ_x yields the rotation angle α_x of the glasses about the x-axis at time t, and the rotation angles α_y and α_z of the device at time t are derived in the same way.
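The per-axis integration just described — velocity from acceleration, displacement as the sum of v_i·Δt, and likewise angle from the angular acceleration ξ — can be sketched as a discrete forward-Euler sum over the k intervals of length Δt (function and variable names are illustrative):

```python
def integrate_axis(samples, dt):
    """Given (a_i, xi_i) pairs of linear and angular acceleration for
    each interval of length dt, return (S, alpha): the accumulated
    displacement along the axis and the rotation angle about it."""
    v = s = 0.0          # linear velocity and displacement S
    w = alpha = 0.0      # angular velocity and rotation angle
    for a, xi in samples:
        v += a * dt
        s += v * dt      # S = sum of v_i * dt
        w += xi * dt
        alpha += w * dt  # alpha = sum of w_i * dt
    return s, alpha

# Constant 2 m/s^2 for 1 s gives S close to the exact a*t^2/2 = 1 m.
s, alpha = integrate_axis([(2.0, 0.0)] * 1000, dt=0.001)
print(round(s, 2))  # 1.0
```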
In this embodiment the displacements of the glasses along the three axes of the first coordinate system obtained by integration are S_x = 0.08 m, S_y = 0.1 m, S_z = -0.15 m, and the rotation angles of the glasses about the three axes are α_x = 3°, α_y = -2°, α_z = 1°.
The following are obtained by measurement: imaging distance of the AR optical system h = 3 m; distance from the eye's retina to the lens d = 0.03 m; angle between the imaging ray from the human eye to the virtual object and the yoz plane λ_1 = 5°; angle between that imaging ray and the xoy plane λ_2 = 5°; magnification of the AR glasses' optical system from the display screen to the optical lens β = 1.5; display screen length along the ρ axis m = 0.008 m with resolution m_1 = 1024; display screen length along the σ axis n = 0.007 m with resolution n_1 = 768. By calculation, the number of pixels the virtual object moves along the ρ axis of the display screen is c_ρ = c_sx + c_az = -23.2, where
$$c_{sx} = -\frac{d \cdot S_x \cdot m_1}{h \cdot \beta \cdot m}, \qquad c_{az} = \frac{d \cdot \left[\tan(\lambda_1 + \alpha_z) - \tan\lambda_1\right] \cdot m_1}{\beta \cdot m}.$$
The number of pixels the virtual object moves along the σ axis of the display screen is c_σ = c_sz + c_ax = -6.7, where
$$c_{sz} = -\frac{d \cdot S_z \cdot n_1}{h \cdot \beta \cdot n}, \qquad c_{ax} = -\frac{d \cdot \left[\tan(\lambda_2 + \alpha_x) - \tan\lambda_2\right] \cdot n_1}{\beta \cdot n}.$$
The pixels by which the virtual object must move along the ρ and σ axes are obtained by calculation; positive directions for the ρ and σ axes are then assumed, and the image of the virtual object is moved on the two axes by the calculated numbers of pixels. When the position of the virtual object as seen by the user in the AR glasses keeps the same relative position to real objects as before the glasses moved, the directions in which the virtual object was moved along the ρ and σ axes are taken as the positive directions, and the positive direction of the η axis is then determined by the right-hand rule, yielding the positive directions of the second coordinate system, as shown in Fig. 3. The rotations the virtual object needs about the three axes of the second coordinate system are then calculated.
The rotation angle of the virtual object about the display screen's σ axis is θ_σ = -1.5°, i.e. the virtual object is rotated 1.5° in the direction defined as negative for the σ axis;
the rotation angle of the virtual object about the display screen's ρ axis is θ_ρ = -2.853°, i.e. the virtual object is rotated 2.853° about the ρ axis in the direction defined as negative;
the rotation angle of the virtual object about the display screen's η axis is θ_η = 2°, where θ_η = α_y; the virtual object is rotated 2° about the η axis in the defined positive direction.
After the above adjustments, the position and posture of the virtual object on the display screen achieve virtual fusion with the real object, but its size still needs adjustment. Following the principle that distant objects appear small and near objects large, a series of derivations and many experiments yield the magnification ratio ∈ of the virtual object. During virtual-real fusion the magnification factor of the virtual object must also be considered: when the magnification ratio equals 1, the virtual object is neither enlarged nor reduced; when it is greater or less than 1, the virtual object is scaled by a factor of ∈. In this embodiment ∈ = 1.042 is calculated, i.e. the virtual object is enlarged 1.042 times. With the above adjustments, the position of the virtual object on the display screen and the real object achieve a high degree of virtual fusion.
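The embodiment's pixel counts can be reproduced with a short script. The σ-axis component formulas below are transcribed from claim 5's markup; the ρ-axis counterparts used here are the symmetric forms (an assumption where the original formula images are not reproduced in this text), chosen because they reproduce the reported value of -23.2 pixels:

```python
import math

deg = math.radians

# Values measured/obtained in the embodiment
h, d, beta = 3.0, 0.03, 1.5        # imaging distance, retina-to-lens distance, optical magnification
lam1, lam2 = deg(5), deg(5)        # imaging-ray angles to the yoz and xoy planes
m, m1 = 0.008, 1024                # screen length and resolution along the rho axis
n, n1 = 0.007, 768                 # screen length and resolution along the sigma axis
Sx, Sz = 0.08, -0.15               # glasses displacement along x and z
alpha_x, alpha_z = deg(3), deg(1)  # glasses rotation about x and z

# sigma-axis components, transcribed from claim 5
c_sz = -(d * Sz * n1) / (h * beta * n)
c_ax = -(d * (math.tan(lam2 + alpha_x) - math.tan(lam2)) * n1) / (beta * n)
c_sigma = c_sz + c_ax

# rho-axis components: symmetric counterparts (an assumption) that
# reproduce the embodiment's reported pixel count
c_sx = -(d * Sx * m1) / (h * beta * m)
c_az = (d * (math.tan(lam1 + alpha_z) - math.tan(lam1)) * m1) / (beta * m)
c_rho = c_sx + c_az

print(round(c_rho, 1), round(c_sigma, 1))  # -23.2 -6.7
```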
The method provided by the invention, which reaches virtual-real fusion by adjusting the position of the virtual object, gives good fusion with small error; all calculations are completed in software, providing a new line of thought and research direction for existing virtual-real fusion technology. It should be noted that the above embodiment is only one implementation of the present invention, and the scope protected by the invention is not limited to this embodiment.

Claims (5)

1. A method for realizing virtual-real fusion for AR glasses, for adjusting in real time the pose of the virtual object image on the AR glasses' display screen according to changes in the pose of the AR glasses, characterized by comprising:
obtaining the displacement and rotation angle of the AR glasses in three-dimensional space; calculating, from the obtained displacement and rotation angle, the pixels by which the virtual object must be moved on the display screen, the angle through which it must be rotated, and its magnification ratio; and adjusting the pose of the virtual object image on the AR glasses' display screen in real time according to the calculated values.
2. The method according to claim 1, characterized in that obtaining the displacement and rotation angle of the AR glasses in three-dimensional space specifically comprises:
obtaining the acceleration and angular acceleration values produced by the AR glasses in three-dimensional space, and integrating the acceleration and angular acceleration to obtain the displacement and rotation angle of the AR glasses in three-dimensional space.
3. The method according to claim 2, characterized in that, before obtaining the acceleration and angular acceleration values produced by the AR glasses in three-dimensional space, the method further comprises:
establishing a first coordinate system to describe the movement of the AR glasses in three-dimensional space, and a second coordinate system to describe the position of the virtual object on the display screen, specifically:
taking the line connecting the two eyeballs as the x-axis and the pointing direction of the viewfinder camera axis in front of the AR glasses as the y-axis to construct the xoy plane, and the direction perpendicular to the xoy plane as the z-axis, establishing the first coordinate system xoyz; taking the plane of the AR glasses' display screen as the σoρ plane and the direction perpendicular to it as the η axis, establishing the second coordinate system σoρη.
4. The method according to any one of claims 1-3, characterized in that obtaining the acceleration and angular acceleration values produced by the AR glasses in three-dimensional space and integrating them to obtain the displacement and rotation angle of the AR glasses in three-dimensional space specifically comprises:
using an inertial sensor and a gyroscope to obtain the acceleration and angular acceleration values of the AR glasses along the three axes of the first coordinate system, and integrating the acceleration and angular acceleration to obtain the displacements S_x, S_y, S_z of the AR glasses along the three axes of the first coordinate system and the rotation angles α_x, α_y, α_z of the AR glasses about the three axes.
5. The method according to any one of claims 1-4, characterized in that calculating the pixels to be moved, the angle to be rotated and the magnification ratio of the virtual object on the display screen from the displacement and rotation angle specifically comprises: when the displacements of the AR glasses along the three axes of the first coordinate system are S_x, S_y, S_z and the rotation angles of the AR glasses about the three axes are α_x, α_y, α_z,
the pixels by which the virtual object must be moved along the ρ axis of the display screen are c_ρ = c_sx + c_az, where
$$c_{sx} = -\frac{d \cdot S_x \cdot m_1}{h \cdot \beta \cdot m}, \qquad c_{az} = \frac{d \cdot \left[\tan(\lambda_1 + \alpha_z) - \tan\lambda_1\right] \cdot m_1}{\beta \cdot m};$$
the pixels by which the virtual object must be moved along the σ axis of the display screen are c_σ = c_sz + c_ax, where
$$c_{sz} = -\frac{d \cdot S_z \cdot n_1}{h \cdot \beta \cdot n}, \qquad c_{ax} = -\frac{d \cdot \left[\tan(\lambda_2 + \alpha_x) - \tan\lambda_2\right] \cdot n_1}{\beta \cdot n};$$
the angle through which the virtual object must be rotated about the display screen's σ axis is θ_σ;
the angle through which the virtual object must be rotated about the display screen's ρ axis is θ_ρ;
the angle through which the virtual object must be rotated about the display screen's η axis is θ_η, where θ_η = α_y;
and the magnification ratio of the virtual object is ∈, where
$$\in \,=\, \frac{h}{h - S_y} \cdot \frac{1}{1 - \frac{1}{\cos\left(d \cdot \tan\lambda_1 - \frac{d \cdot S_x}{h}\right)} + \frac{1}{\cos\lambda_1}} \cdot \frac{1}{1 - \frac{1}{\cos\left(d \cdot \tan\lambda_2 - \frac{d \cdot S_z}{h}\right)} + \frac{1}{\cos\lambda_2}};$$
Wherein h is the image-forming range of AR optical systems, and d is eye retina to the distance of AR eyeglass, λ1For human eye and virtually Imaging light and the angle of yoz planes between object, λ2Imaging light and xoy planes between human eye and dummy object Angle, enlargement ratio of the AR glasses optical systems from display screen to optical mirror slip are β, and display screen is m in ρ direction of principal axis length, in ρ The resolution ratio of direction of principal axis is m1, display screen is n in the length of σ direction of principal axis, is n in the resolution ratio of σ direction of principal axis1
CN201711251652.2A 2017-12-01 2017-12-01 Method for realizing virtual-real fusion of AR glasses Active CN108021241B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711251652.2A CN108021241B (en) 2017-12-01 2017-12-01 Method for realizing virtual-real fusion of AR glasses

Publications (2)

Publication Number Publication Date
CN108021241A true CN108021241A (en) 2018-05-11
CN108021241B CN108021241B (en) 2020-08-25

Family

ID=62078206

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108771859A (en) * 2018-06-22 2018-11-09 腾讯科技(深圳)有限公司 Virtual scene display method, apparatus, electronic device and storage medium
CN108776003A (en) * 2018-06-08 2018-11-09 歌尔股份有限公司 A kind of detection method of VR equipment
CN108805917A (en) * 2018-05-25 2018-11-13 网易(杭州)网络有限公司 Sterically defined method, medium, device and computing device
CN109218709A (en) * 2018-10-18 2019-01-15 北京小米移动软件有限公司 The method of adjustment and device and computer readable storage medium of holographic content
CN109214351A (en) * 2018-09-20 2019-01-15 太平洋未来科技(深圳)有限公司 A kind of AR imaging method, device and electronic equipment
CN110708384A (en) * 2019-10-12 2020-01-17 西安维度视界科技有限公司 Interaction method, system and storage medium of AR-based remote assistance system
CN111528920A (en) * 2020-05-25 2020-08-14 居天智慧(深圳)有限公司 Augmented reality observation device for ultrasound device
WO2021008367A1 (en) * 2019-07-16 2021-01-21 于毅欣 Moving method and device in virtual reality
CN112396718A (en) * 2020-10-09 2021-02-23 广东电网有限责任公司阳江供电局 On-site construction safety and quality supervision research system based on AR technology
CN112634462A (en) * 2020-12-21 2021-04-09 上海影创信息科技有限公司 Temperature matching augmented reality method and system for AR glasses
CN112764658A (en) * 2021-01-26 2021-05-07 北京小米移动软件有限公司 Content display method and device and storage medium
CN114742977A (en) * 2022-03-30 2022-07-12 青岛虚拟现实研究院有限公司 Video perspective method based on AR technology
US11880956B2 (en) 2019-08-28 2024-01-23 Shenzhen Sensetime Technology Co., Ltd. Image processing method and apparatus, and computer storage medium
CN118070554A (en) * 2024-04-16 2024-05-24 北京天创凯睿科技有限公司 Flight simulation mixed reality display system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102157011A (en) * 2010-12-10 2011-08-17 北京大学 Method for carrying out dynamic texture acquisition and virtuality-reality fusion by using mobile shooting equipment
CN105279750A (en) * 2014-07-09 2016-01-27 雷震 Equipment display guiding system based on IR-UWB and image moment
CN105677021A (en) * 2015-12-31 2016-06-15 周岩 New method based on fusion of AR system and VR system and applying new method to recurrence of relics
US20170278306A1 (en) * 2016-03-25 2017-09-28 Sony Computer Entertainment Inc. Virtual Reality (VR) Cadence Profile Adjustments for Navigating VR Users in VR Environments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 710061 Xi'an new hi tech Zone, Shaanxi, A 604, national digital publishing base, Tian Gu Road, software city.

Applicant after: XI'AN VIDOAR TECHNOLOGY Co.,Ltd.

Address before: 710061 Xi'an new hi tech Zone, Shaanxi, A 604, national digital publishing base, Tian Gu Road, software city.

Applicant before: XI'AN XLOONG TECHNOLOGIES Co.,Ltd.

GR01 Patent grant