CN108881892B - Anti-dizziness method and system for desktop virtual reality system - Google Patents
- Publication number: CN108881892B
- Application number: CN201810503485.4A
- Authority: CN (China)
- Prior art keywords: eye, coordinate, user, glasses, lens
- Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Classifications
- Processing Or Creating Images (AREA)
Abstract
The invention discloses an anti-dizziness method and system for a desktop virtual reality system, and relates to the technical field of virtual reality. The method comprises the following steps: determining the position data of the left eye and the right eye of a user according to the position data of the center point of the left lens and the position data of the center point of the right lens of the 3D glasses and a preset offset; obtaining posture data of the left eye and the right eye of the user according to the position data of the left eye and the right eye of the user; and respectively assigning the posture data of the left eye and the right eye of the user to a left camera and a right camera in a virtual scene in the 3D display, so that the virtual scene is updated in real time according to the eye position of the user. The anti-dizziness method and system provided by the invention realize scene follow-up, so that the display effect of the virtual scene better matches natural human vision, and dizziness of a user when using the desktop virtual reality system can be effectively prevented.
Description
Technical Field
The invention relates to the technical field of virtual reality, in particular to an anti-dizziness method and system for a desktop virtual reality system.
Background
The desktop virtual reality system is a small virtual reality system formed by a 3D stereoscopic display, tracking equipment, interaction equipment and a PC.
At present, virtual reality equipment has a critical problem: it easily causes dizziness, nausea and even vomiting in users, and also easily causes visual fatigue. This problem restricts the development and wide adoption of virtual reality technology.
At present, methods for preventing vertigo mainly focus on increasing the screen refresh frequency, the scene update speed, and the like. The main technical approaches are:
1. Improving screen resolution. Display resolution determines display definition; improving it can reduce visual fatigue and vertigo to a certain extent.
2. Reducing tracking delay. Tracking delay makes the displayed result lag behind the user's operation intention during interaction; improving tracking accuracy and reducing delay time can reduce dizziness to a certain extent.
3. Increasing the refresh frequency. Raising the refresh frequency and frame rate reduces display stuttering, which can reduce vertigo to a certain extent.
However, the above methods can only reduce user vertigo through hardware upgrades, and have no obvious effect on actually preventing it.
Disclosure of Invention
The invention aims to solve the technical problem of the prior art and provides an anti-dizziness method and system for a desktop virtual reality system.
The technical scheme for solving the technical problems is as follows:
an anti-dizziness method for a desktop virtual reality system, the desktop virtual reality system comprising: 3D glasses and a 3D display, the anti-dizziness method comprising:
determining the position data of the left eye and the right eye of the user according to the position data of the center point of the left lens of the 3D glasses, the position data of the center point of the right lens of the 3D glasses and a preset offset;
obtaining the posture data of the left eye and the right eye of the user according to the position data of the left eye and the right eye of the user;
and assigning the posture data of the left eye and the right eye of the user to a left camera and a right camera in a virtual scene in the 3D display respectively, so that the virtual scene is updated in real time according to the eye position of the user.
The invention has the beneficial effects that: according to the anti-dizziness method provided by the invention, the eye space position information of the user is obtained and assigned to the camera in the virtual scene of the desktop virtual reality system, so that the camera updates the scene in real time along with the eye space position of the user, scene follow-up is realized, the display effect of the virtual scene is more consistent with the feeling of human eyes, and the vertigo feeling of the user when using the desktop virtual reality system can be effectively prevented.
Another technical solution of the present invention for solving the above technical problems is as follows:
an anti-dizziness system for a desktop virtual reality system, the desktop virtual reality system comprising: 3D glasses and a 3D display, the anti-dizziness system comprising:
the first processing unit is used for determining the position data of the left eye and the right eye of the user according to the position data of the center point of the left lens and the position data of the center point of the right lens of the 3D glasses and a preset offset;
the second processing unit is used for obtaining the posture data of the left eye and the right eye of the user according to the position data of the left eye and the right eye of the user;
and the transmission unit is used for respectively assigning the posture data of the left eye and the right eye of the user to the left camera and the right camera in the virtual scene in the 3D display so as to update the virtual scene in real time according to the eye position of the user.
Another technical solution of the present invention for solving the above technical problems is as follows:
a terminal for controlling the desktop virtual reality system using the anti-dizziness method according to any one of the above technical solutions.
Another technical solution of the present invention for solving the above technical problems is as follows:
a storage medium having instructions stored therein which, when read by a computer, cause the computer to execute the anti-dizziness method according to any one of the above technical solutions.
Advantages of additional aspects of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
FIG. 1 is a schematic flow chart provided by an embodiment of the anti-dizziness method for a desktop virtual reality system according to the present invention;
FIG. 2 is a schematic diagram illustrating the offset between an eyeball and a lens provided by an embodiment of the anti-dizziness method for a desktop virtual reality system of the present invention;
FIG. 3 is a schematic diagram illustrating a distribution of mark points provided by an embodiment of the anti-dizziness method for a desktop virtual reality system according to the present invention;
FIG. 4 is a schematic diagram illustrating a position relationship of mark points provided by an embodiment of the anti-dizziness method for a desktop virtual reality system according to the present invention;
FIG. 5 is a schematic diagram illustrating position identification of mark points provided by an embodiment of the anti-dizziness method for a desktop virtual reality system according to the present invention;
FIG. 6 is a schematic diagram of a rectangular coordinate system provided by an embodiment of the anti-dizziness method for a desktop virtual reality system according to the present invention;
FIG. 7 is a structural framework diagram provided by an embodiment of the anti-dizziness system for a desktop virtual reality system according to the invention.
Detailed Description
The principles and features of this invention are described below in conjunction with the drawings, which are provided to illustrate and not to be construed as limiting the scope of the invention.
As shown in fig. 1, a schematic flow chart is provided for an embodiment of an anti-dizziness method for a desktop virtual reality system according to the present invention. The desktop virtual reality system includes 3D glasses, a 3D display and an interaction device: after wearing the 3D glasses, a user can watch the virtual image displayed by the 3D display, and can interact with that virtual image through the interaction device. However, some users experience vertigo when using the desktop virtual reality system, and vertigo cannot be effectively prevented merely by increasing screen resolution, refresh frequency and the like. On this basis, the present embodiment provides an anti-dizziness method, the method comprising:
and S1, determining the position data of the left eye of the user according to the position data of the center point of the left lens of the 3D glasses and the preset offset, and determining the position data of the right eye of the user according to the position data of the center point of the right lens of the 3D glasses and the preset offset.
It should be noted that the preset offset is the offset between an eyeball and a lens. When a user wears the 3D glasses, the lenses of the glasses and the eyeballs of the user do not completely coincide; as shown in fig. 2, a certain distance exists between the eyeball and the lens, and this distance is the offset d. The position data of the left and right eyes of the user can be obtained by adding the offset to the position data of the center points of the left and right lenses, respectively.
Since different users have different facial structures, the offset can be set according to actual requirements. For example, the offset may be set to d = 0.025 m, based on the typical 2-3 cm distance between ordinary eyeglasses and the eyeball.
It should be understood that, in order to obtain the position data of the center points of the left and right lenses of the 3D glasses, mark points may be set on the 3D glasses in advance, and since the positions of the mark points are fixed on the 3D glasses, the positions of the mark points relative to the center points of the left and right lenses are also fixed, so that after the position data of the mark points are identified, the position data of the center points of the left and right lenses may be calculated according to the position data of the mark points.
For example, as shown in fig. 3 and 4, 3 mark points may be set on the frame of the left lens by taking the center point of the left lens as the center, and then the position of the center point of the left lens in three-dimensional space can be determined by these 3 mark points, and the right lens may also be set with mark points in the same way.
Specifically, the left and right camera pictures can be obtained by shooting the 3D glasses with the binocular infrared camera on the 3D display. As shown in fig. 5, assume point P is a mark point on the 3D glasses and its coordinate in camera space is (x, y, z). First, the coordinates of point P projected onto the left and right camera pictures are obtained, respectively (X_left, Y_left) and (X_right, Y_right). Since both cameras are head-up cameras, Y_left = Y_right = Y, and the following relations are obtained from the camera's intrinsic focal length f:

X_left = f·x / z, X_right = f·(x - B) / z, Y = f·y / z,

where B is the baseline distance of the binocular infrared camera.

From this, the coordinates (x, y, z) of point P in camera space can be calculated:

z = f·B / (X_left - X_right), x = z·X_left / f, y = z·Y / f.
according to the above formula, the coordinates of all mark points on the 3D glasses can be calculated, and then according to the coordinates of all mark points and the position relationship between the coordinates and the center points of the left and right lenses, the position coordinates of the center points of the left and right lenses, namely the position data of the center points of the left and right lenses, can be obtained.
It can be understood that the image of the glasses can be subjected to image recognition processing, and after the image is subjected to expansion, binarization, filtering and other processing, a contour map of the 3D glasses can be obtained, so that the position data of the center points of the left and right lenses can be determined by an image positioning method.
It can be understood that the 3D glasses can be additionally provided with the attitude sensor, the position data and the attitude data of the 3D glasses are obtained in real time, and the position data of the central points of the left lens and the right lens of the 3D glasses are obtained through calculation according to the position data and the attitude data.
And S2, obtaining the posture data of the left eye and the right eye of the user according to the position data of the left eye and the right eye of the user.
It should be noted that, because the left and right cameras in the virtual scene in the 3D display simulate the eyes of the user and their distances and positions are fixed, their attitude angles are consistent with the attitude angle of the head of the user, and the attitude matrices of the left and right eyes of the user can be obtained by transforming the position data of the left and right eyes of the user into the world space.
It should be understood that if the position data of the 3D glasses is acquired by recognizing the mark point, the posture data of the left and right eyes of the user can be obtained by combining the world space coordinate system of the 3D glasses.
It should be understood that if the position data of the 3D glasses is obtained through the posture sensor, after the position data of the left and right eyes of the user is obtained, the posture data of the left and right eyes of the user can be calculated according to the posture data of the 3D glasses.
And S3, assigning the posture data of the left eye and the right eye of the user to a left camera and a right camera in a virtual scene in the 3D display respectively, and updating the virtual scene in real time according to the eye position of the user.
According to the anti-dizziness method provided by the embodiment, the eye space position information of the user is acquired and assigned to the camera in the virtual scene of the desktop virtual reality system, so that the camera updates the scene in real time along with the eye space position of the user, scene follow-up is realized, the display effect of the virtual scene is more consistent with the feeling of human eyes, and the vertigo of the user when the desktop virtual reality system is used can be effectively prevented.
Optionally, in some embodiments, determining the position data of the left and right eyes of the user according to the position data of the center point of the left lens of the 3D glasses, the position data of the center point of the right lens of the 3D glasses, and the preset offset may specifically include:
respectively acquiring a first center coordinate (x_l, y_l, z_l) of the center point of the left lens of the 3D glasses in camera space and a second center coordinate (x_r, y_r, z_r) of the center point of the right lens in camera space;

establishing a rectangular coordinate system of the 3D glasses, and calculating a Z-axis direction vector w of the rectangular coordinate system according to the first center coordinate (x_l, y_l, z_l) and the second center coordinate (x_r, y_r, z_r);

it should be noted that the rectangular coordinate system of the 3D glasses may be set according to actual requirements; for convenience of calculation, the origin of the rectangular coordinate system may be set at the center between the two lenses of the 3D glasses;

according to the first center coordinate (x_l, y_l, z_l), the second center coordinate (x_r, y_r, z_r), the preset offset d and the Z-axis direction vector w, respectively calculating a first eye coordinate x_le of the left eye of the user in camera space, x_le = x_l + d·w, and a second eye coordinate x_re of the right eye in camera space, x_re = x_r + d·w.
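The per-eye offset step is a one-liner per axis; a minimal sketch, with w the glasses' unit Z-axis direction vector (function name and tuple representation are assumptions):

```python
def eye_positions(c_left, c_right, w, d=0.025):
    """Estimate the eye positions by offsetting each lens-center point
    along the glasses' Z-axis direction vector w by the preset offset d
    (metres): x_le = x_l + d*w, x_re = x_r + d*w, all 3-vectors."""
    left = tuple(c + d * wi for c, wi in zip(c_left, w))
    right = tuple(c + d * wi for c, wi in zip(c_right, w))
    return left, right

# Lens centers 6 cm apart, w pointing from the lenses toward the face:
print(eye_positions((0.0, 0.0, 0.0), (0.06, 0.0, 0.0), (0.0, 0.0, 1.0)))
```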
Optionally, in some embodiments, respectively acquiring the first center coordinate (x_l, y_l, z_l) of the center point of the left lens of the 3D glasses in camera space and the second center coordinate (x_r, y_r, z_r) of the center point of the right lens in camera space may specifically include:
respectively acquiring n point coordinates of n preset points on the 3D glasses in a camera space, wherein n is greater than 2;
it should be understood that the preset point here may be a mark point preset on the 3D glasses.
respectively calculating, according to the spatial geometric position relationship of the n point coordinates, the first center coordinate (x_l, y_l, z_l) of the center point of the left lens of the 3D glasses in camera space and the second center coordinate (x_r, y_r, z_r) of the center point of the right lens in camera space.
For example, taking the left lens as an example, 3 mark points may be set, the 3 mark points may enclose a triangle, and the center of the triangle coincides with the center point of the left lens, so that the center coordinate of the triangle is obtained through calculation through a geometric relationship of the triangle, and the center coordinate of the left lens can be obtained, and the right lens is similar.
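The centroid calculation for a 3-marker lens frame can be sketched as follows (function name assumed; points are camera-space (x, y, z) triples):

```python
def lens_center(p1, p2, p3):
    """Center of the triangle enclosed by the 3 mark points on a lens
    frame; by construction it coincides with the lens center point."""
    return tuple((a + b + c) / 3.0 for a, b, c in zip(p1, p2, p3))
```

The same helper applies to the right lens, using its own 3 mark points.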
Optionally, in some embodiments, establishing the rectangular coordinate system of the 3D glasses and calculating the Z-axis direction vector w of the rectangular coordinate system according to the first center coordinate (x_l, y_l, z_l) and the second center coordinate (x_r, y_r, z_r) may specifically include:
as shown in fig. 6, a rectangular coordinate system is established with the center of the connection line between the center point of the left lens and the center point of the right lens of the 3D glasses as the origin, the direction in which the center points of the left lens and the right lens of the 3D glasses are located is the X axis, and the plane in which the left lens and the right lens of the 3D glasses are located is the XY plane;
according to the first center coordinate (x_l, y_l, z_l) and the second center coordinate (x_r, y_r, z_r), calculating an X-axis direction vector u of the rectangular coordinate system, u = (x_r - x_l, y_r - y_l, z_r - z_l) / |(x_r - x_l, y_r - y_l, z_r - z_l)|;

determining a Y-axis direction vector v of the rectangular coordinate system, where v = (0, 1, 0);

calculating the Z-axis direction vector w from the X-axis direction vector u and the Y-axis direction vector v, w = u × v, where × denotes the cross product (right-hand rule).
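The frame construction just described can be sketched as follows, under the assumption that u points from the left lens center to the right and is normalised (the patent text leaves the exact formula to a figure; the function name is hypothetical):

```python
import math

def glasses_axes(c_left, c_right):
    """Right-handed frame of the 3D glasses: u along the line of lens
    centers (left -> right, normalised), v = (0, 1, 0), w = u x v."""
    ux, uy, uz = (r - l for l, r in zip(c_left, c_right))
    n = math.sqrt(ux * ux + uy * uy + uz * uz)
    u = (ux / n, uy / n, uz / n)
    v = (0.0, 1.0, 0.0)
    # cross product w = u x v
    w = (u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    return u, v, w
```

Note that v = (0, 1, 0) is only a valid up-vector while the glasses are roughly level; a full implementation would handle head roll via the tracked marker geometry.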
Optionally, in some embodiments, obtaining the posture data of the left and right eyes of the user according to the position data of the left and right eyes of the user may specifically include:
transforming, according to a preset coordinate system transformation matrix M_T-W, the first eye coordinate x_le and the second eye coordinate x_re to world space respectively, obtaining a first eye world coordinate x_le′, x_le′ = M_T-W·x_le, and a second eye world coordinate x_re′, x_re′ = M_T-W·x_re;

respectively transforming the X-axis direction vector u, the Y-axis direction vector v and the Z-axis direction vector w of the rectangular coordinate system to world space, obtaining an X-axis world direction vector u′, u′ = M_T-W·u, a Y-axis world direction vector v′, v′ = M_T-W·v, and a Z-axis world direction vector w′, w′ = M_T-W·w;

calculating, from the first eye world coordinate x_le′, the second eye world coordinate x_re′, the X-axis world direction vector u′, the Y-axis world direction vector v′ and the Z-axis world direction vector w′, the posture data M_l of the left eye and the posture data M_r of the right eye of the user.
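The patent's matrix formula images did not survive extraction, so the sketch below uses one common convention for the per-eye pose: a 4×4 matrix with rotation columns u′, v′, w′ and the eye world coordinate as the translation column. The layout and function name are assumptions, not necessarily the patent's exact form.

```python
def eye_pose(M_tw, x_eye, u, v, w):
    """Transform an eye coordinate and the glasses' axis vectors into
    world space with the tracker-to-world matrix M_tw (4x4, row-major
    nested lists), then assemble a 4x4 pose matrix whose rotation
    columns are u', v', w' and whose translation is the eye world
    coordinate."""
    def xform_point(M, p):
        hp = list(p) + [1.0]                 # homogeneous point (w = 1)
        return [sum(M[i][j] * hp[j] for j in range(4)) for i in range(3)]

    def xform_dir(M, d):                     # directions: rotation part only
        return [sum(M[i][j] * d[j] for j in range(3)) for i in range(3)]

    xw = xform_point(M_tw, x_eye)
    uw, vw, ww = (xform_dir(M_tw, a) for a in (u, v, w))
    pose = [[uw[i], vw[i], ww[i], xw[i]] for i in range(3)]
    pose.append([0.0, 0.0, 0.0, 1.0])
    return pose
```

Calling this once with x_le and once with x_re yields the two matrices M_l and M_r that are assigned to the left and right virtual cameras.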
Optionally, in some embodiments, the scale, length, shape, and pattern of virtual objects within the virtual scene are the same as real objects within the real scene.
It should be noted that part of human perception of objects in a virtual environment comes from accumulated knowledge of real objects, and a virtual object that is too large or too small relative to its real counterpart degrades the visual experience. Therefore, when creating a model, the size and units of objects in the virtual scene should match the real objects as closely as possible, the material and texture of the model should resemble those of the real object, and bright straight lines in the scene should be avoided. This also helps prevent dizziness.
Optionally, in some embodiments, the virtual scene is located within a preset region of the 3D display centered on the zero disparity plane.
In current 3D imaging technology, the convergence of both eyes is used to form an image at a spatial point, so that a virtual object appears to jump out of or sink into the screen, creating a sense of depth and space. When a virtual object comes out of or goes into the screen, convergence and focus adjustment become inconsistent, and the eyes continuously rebalance and readapt; after some time this can cause visual fatigue symptoms such as blurred vision, dry eyes, dizziness, nausea and vomiting. The zero parallax plane is the only plane where convergence and focus are consistent; the farther from this plane, the greater the convergence-focus mismatch, and the more easily fatigue and dizziness arise.
Therefore, displaying the virtual scene in a preset region of the 3D display centered on the zero parallax plane can effectively prevent the user from becoming dizzy, and the size of the preset region can be set according to actual requirements.
Preferably, in this embodiment the virtual scene is arranged in the relatively comfortable cubic region extending 0.13 meters in front of and 0.3 meters behind the zero disparity plane.
By displaying the virtual scene in the area, better display effect can be obtained, and simultaneously, eye fatigue and dizziness of the user can be reduced.
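The comfort band above can be expressed as a simple predicate. The sign convention (positive = out of the screen toward the viewer) and the function name are assumptions:

```python
def in_comfort_region(z_offset, front=0.13, behind=0.30):
    """Check whether a virtual object's signed distance from the zero
    disparity plane (positive = out of the screen toward the viewer,
    in metres) lies inside the comfortable band of 0.13 m in front and
    0.3 m behind suggested by the embodiment above."""
    return -behind <= z_offset <= front
```

A scene-placement routine could use this to clamp or warn about objects drifting outside the band.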
It is to be understood that some or all of the steps described in the embodiments above may alternatively be included in some embodiments.
As shown in fig. 7, a structural framework diagram is provided for an embodiment of an anti-dizziness system for a desktop virtual reality system according to the present invention, where the desktop virtual reality system includes: 3D glasses and a 3D display, and the anti-dizziness system includes:
the first processing unit 1 is used for determining the position data of the left eye and the right eye of a user according to the position data of the center point of the left lens and the position data of the center point of the right lens of the 3D glasses and a preset offset;
the second processing unit 2 is used for obtaining the posture data of the left eye and the right eye of the user according to the position data of the left eye and the right eye of the user;
and the transmission unit 3 is used for respectively assigning the posture data of the left eye and the right eye of the user to the left camera and the right camera in the virtual scene in the 3D display, so that the virtual scene is updated in real time according to the eye position of the user.
It should be noted that this embodiment is a product example corresponding to each of the above embodiments, and for the description of each part in this embodiment, reference may be made to the corresponding description in the above embodiments, and details are not repeated here.
In other embodiments of the present invention, a terminal is further provided, wherein the terminal controls a desktop virtual reality system by using the anti-dizziness method as described in any one of the above embodiments.
In other embodiments of the present invention, there is also provided a storage medium having stored therein instructions that, when read by a computer, cause the computer to execute the anti-dizziness method according to any one of the above embodiments.
The reader should understand that in the description of this specification, reference to the description of the terms "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention essentially or partially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
While the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (8)
1. An anti-dizziness method for a desktop virtual reality system, the desktop virtual reality system comprising: 3D glasses and a 3D display, characterized in that the anti-dizziness method comprises:
determining the position data of the left eye and the right eye of the user according to the position data of the center point of the left lens of the 3D glasses, the position data of the center point of the right lens of the 3D glasses and a preset offset;
obtaining the posture data of the left eye and the right eye of the user according to the position data of the left eye and the right eye of the user;
assigning the posture data of the left eye and the right eye of the user to a left camera and a right camera in a virtual scene in the 3D display respectively, so that the virtual scene is updated in real time according to the eye position of the user;
the method for determining the position data of the left eye and the right eye of the user according to the position data of the central points of the left lens and the right lens of the 3D glasses and the preset offset specifically comprises the following steps:
respectively acquiring n point coordinates of n preset points on the 3D glasses in the camera space, wherein n is greater than 2;
respectively calculating, according to the spatial geometric position relationship of the n point coordinates, a first center coordinate (x_l, y_l, z_l) of the center point of the left lens of the 3D glasses in the camera space and a second center coordinate (x_r, y_r, z_r) of the center point of the right lens in the camera space;
establishing a rectangular coordinate system of the 3D glasses, and calculating a Z-axis direction vector w of the rectangular coordinate system according to the first center coordinate (x_l, y_l, z_l) and the second center coordinate (x_r, y_r, z_r);
according to the first center coordinate (x_l, y_l, z_l), the second center coordinate (x_r, y_r, z_r), a preset offset d and the Z-axis direction vector w, respectively calculating a first eye coordinate x_le of the left eye of the user in the camera space, x_le = x_l + d·w, and a second eye coordinate x_re of the right eye in the camera space, x_re = x_r + d·w.
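For illustration, the per-eye position computation of claim 1 (with the axis construction elaborated in claim 2) can be sketched as follows. This is only an illustrative sketch: all function and variable names are ours, the numeric values are made up, and the cross-product order used to obtain the Z-axis vector w is an assumption, since the claim's explicit formula for w is not reproduced in the extracted text.

```python
def normalize(v):
    """Scale a 3-vector to unit length."""
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v)

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def eye_positions(left_center, right_center, d):
    """Estimate left/right eye positions in camera space.

    left_center, right_center: tracked lens-center coordinates
      (x_l, y_l, z_l) and (x_r, y_r, z_r) from the n marker points.
    d: preset lens-to-eye offset along the glasses' Z axis.

    X axis u: from the left lens center toward the right lens center.
    Y axis v: fixed up vector (0, 1, 0), as in claim 2.
    Z axis w: u x v (order assumed), normal to the lens plane.
    Each eye sits at its lens center shifted by d along w:
      x_le = x_l + d*w,  x_re = x_r + d*w.
    """
    u = normalize(tuple(r - l for l, r in zip(left_center, right_center)))
    v = (0.0, 1.0, 0.0)
    w = normalize(cross(u, v))
    x_le = tuple(c + d * wc for c, wc in zip(left_center, w))
    x_re = tuple(c + d * wc for c, wc in zip(right_center, w))
    return x_le, x_re
```

For lens centers at (±0.032, 0, 0) and d = 0.012, both eyes come out displaced by 0.012 along the lens-plane normal, as expected.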
2. The anti-dizziness method according to claim 1, wherein establishing the rectangular coordinate system of the 3D glasses and calculating the Z-axis direction vector w of the rectangular coordinate system according to the first center coordinate (x_l, y_l, z_l) and the second center coordinate (x_r, y_r, z_r) specifically comprises:
establishing a rectangular coordinate system with the midpoint of the line connecting the center points of the left lens and the right lens of the 3D glasses as the origin, the direction from the center point of the left lens to the center point of the right lens as the X axis, and the plane containing the left lens and the right lens of the 3D glasses as the XY plane;
according to the first center coordinate (x_l, y_l, z_l) and the second center coordinate (x_r, y_r, z_r), calculating an X-axis direction vector u of the rectangular coordinate system,
determining a Y-axis direction vector v of the rectangular coordinate system, wherein v = (0, 1, 0);
3. The anti-dizziness method according to claim 2, wherein obtaining the posture data of the left and right eyes of the user according to the position data of the left and right eyes of the user specifically comprises:
performing, according to a preset coordinate-system transformation matrix M_T-W, coordinate transformation of the first eye coordinate x_le and the second eye coordinate x_re to world space, respectively, to obtain a first eye world coordinate x_le′, x_le′ = M_T-W · x_le, and a second eye world coordinate x_re′, x_re′ = M_T-W · x_re;
respectively transforming the X-axis direction vector u, the Y-axis direction vector v and the Z-axis direction vector w of the rectangular coordinate system to the world space, to obtain an X-axis world direction vector u′, u′ = M_T-W · u, a Y-axis world direction vector v′, v′ = M_T-W · v, and a Z-axis world direction vector w′, w′ = M_T-W · w;
calculating, according to the first eye world coordinate x_le′, the second eye world coordinate x_re′, the X-axis world direction vector u′, the Y-axis world direction vector v′ and the Z-axis world direction vector w′, the posture data M_l of the left eye of the user and the posture data M_r of the right eye,
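The world-space step of claim 3 can likewise be sketched. The 4x4 homogeneous form of M_T-W, the point-versus-direction distinction, and the column-axis layout of the resulting pose matrix are all assumptions on our part, since the claim's explicit formulas for M_l and M_r are not reproduced in the extracted text.

```python
def mat_vec(M, v, w_component):
    """Multiply a 4x4 matrix M by (v, w_component) and return the
    first three components. Use w_component = 1.0 for points (the
    translation part of M applies) and 0.0 for direction vectors
    (rotation only)."""
    x = list(v) + [w_component]
    return tuple(sum(M[r][c] * x[c] for c in range(4)) for r in range(3))

def eye_pose(M_T_W, eye, u, v, w):
    """Build one eye's pose matrix in world space.

    M_T_W: assumed 4x4 tracker-to-world matrix (row-major nested lists).
    eye: eye coordinate in camera/tracker space (x_le or x_re).
    u, v, w: the glasses' axis direction vectors from claim 2.
    """
    eye_w = mat_vec(M_T_W, eye, 1.0)   # point: translation applies
    u_w = mat_vec(M_T_W, u, 0.0)       # directions: rotation only
    v_w = mat_vec(M_T_W, v, 0.0)
    w_w = mat_vec(M_T_W, w, 0.0)
    # Assumed layout: transformed axes as columns, eye position as the
    # translation column, giving the camera pose assigned in the claim.
    return [
        [u_w[0], v_w[0], w_w[0], eye_w[0]],
        [u_w[1], v_w[1], w_w[1], eye_w[1]],
        [u_w[2], v_w[2], w_w[2], eye_w[2]],
        [0.0, 0.0, 0.0, 1.0],
    ]
```

With M_T-W set to the identity, the pose reduces to the untransformed axes and eye position, which is a quick sanity check on the convention.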
4. The anti-dizziness method of claim 1, wherein the ratio, length, shape, and pattern of virtual objects within the virtual scene are the same as those of real objects within a real scene.
5. The anti-dizziness method of claim 1, wherein the virtual scene is located within a preset area of the 3D display centered on the zero-disparity plane.
6. An anti-dizziness system for a desktop virtual reality system, the desktop virtual reality system comprising 3D glasses and a 3D display, characterized in that the anti-dizziness system comprises:
a first processing unit, configured to determine the position data of the left eye and the right eye of the user according to the position data of the center point of the left lens of the 3D glasses, the position data of the center point of the right lens, and a preset offset;
a second processing unit, configured to obtain the posture data of the left eye and the right eye of the user according to the position data of the left eye and the right eye of the user;
a transmission unit, configured to assign the posture data of the left eye and the right eye of the user to a left camera and a right camera in a virtual scene in the 3D display, respectively, so that the virtual scene is updated in real time according to the eye positions of the user;
the first processing unit is specifically configured to acquire n point coordinates of n preset points on the 3D glasses in the camera space, where n is greater than 2;
respectively calculate, according to the spatial geometric relationship of the n point coordinates, a first center coordinate (x_l, y_l, z_l) of the center point of the left lens of the 3D glasses in the camera space, and a second center coordinate (x_r, y_r, z_r) of the center point of the right lens in the camera space;
establish a rectangular coordinate system of the 3D glasses, and calculate a Z-axis direction vector w of the rectangular coordinate system according to the first center coordinate (x_l, y_l, z_l) and the second center coordinate (x_r, y_r, z_r); and
according to the first center coordinate (x_l, y_l, z_l), the second center coordinate (x_r, y_r, z_r), a preset offset d and the Z-axis direction vector w, respectively calculate a first eye coordinate x_le of the left eye of the user in the camera space, x_le = x_l + d·w, and a second eye coordinate x_re of the right eye in the camera space, x_re = x_r + d·w.
7. A terminal, wherein the terminal controls the desktop virtual reality system using the anti-dizziness method of any one of claims 1 to 5.
8. A storage medium having stored therein instructions which, when read by a computer, cause the computer to execute the anti-dizziness method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810503485.4A CN108881892B (en) | 2018-05-23 | 2018-05-23 | Anti-dizziness method and system for desktop virtual reality system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108881892A CN108881892A (en) | 2018-11-23 |
CN108881892B true CN108881892B (en) | 2021-01-05 |
Family
ID=64334561
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810503485.4A Active CN108881892B (en) | 2018-05-23 | 2018-05-23 | Anti-dizziness method and system for desktop virtual reality system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108881892B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113342173B (en) * | 2021-06-30 | 2022-09-16 | 厦门元馨智能科技有限公司 | Self-adaptive learning method of integrative glasses based on somatosensory operation |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102045577B (en) * | 2010-09-27 | 2013-03-27 | 昆山龙腾光电有限公司 | Observer tracking system and three-dimensional stereo display system for three-dimensional stereo display |
EP2658270A3 (en) * | 2011-05-13 | 2014-02-26 | Lg Electronics Inc. | Apparatus and method for processing 3-dimensional image |
CN102957931A (en) * | 2012-11-02 | 2013-03-06 | 京东方科技集团股份有限公司 | Control method and control device of 3D (three dimensional) display and video glasses |
JP2016051126A (en) * | 2014-09-02 | 2016-04-11 | パナソニックIpマネジメント株式会社 | Head-up display system and virtual image display device |
CN104601980A (en) * | 2014-12-30 | 2015-05-06 | 深圳市亿思达科技集团有限公司 | Glass tracking-based holographic display device, system and method |
CN107147899B (en) * | 2017-06-06 | 2020-02-11 | 北京德火新媒体技术有限公司 | CAVE display system and method adopting LED3D screen |
2018-05-23: CN application CN201810503485.4A filed; granted as patent CN108881892B (status: Active)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11182958B2 (en) | Infinite far-field depth perception for near-field objects in virtual environments | |
US20160267720A1 (en) | Pleasant and Realistic Virtual/Augmented/Mixed Reality Experience | |
US11170521B1 (en) | Position estimation based on eye gaze | |
CN108377381A (en) | Immersion VR Video Rendering method and devices | |
JP2017174125A (en) | Information processing apparatus, information processing system, and information processing method | |
WO2010044383A1 (en) | Visual field image display device for eyeglasses and method for displaying visual field image for eyeglasses | |
US11956415B2 (en) | Head mounted display apparatus | |
JP5632245B2 (en) | Eyeglass field image display device | |
CN115868158A (en) | Display adjusting method, device, equipment and medium | |
US20210037225A1 (en) | Method of modifying an image on a computational device | |
CN108881892B (en) | Anti-dizziness method and system for desktop virtual reality system | |
CN111915739A (en) | Real-time three-dimensional panoramic information interactive information system | |
US11353953B2 (en) | Method of modifying an image on a computational device | |
CN111915741A (en) | VR generater based on three-dimensional reconstruction | |
KR101873161B1 (en) | Method and apparatus for providing personal 3-dimensional image using convergence matching algorithm | |
WO2017191703A1 (en) | Image processing device | |
CN115202475A (en) | Display method, display device, electronic equipment and computer-readable storage medium | |
WO2017085803A1 (en) | Video display device and video display method | |
WO2020036114A1 (en) | Image processing device, image processing method, and program | |
CN111915740A (en) | Rapid three-dimensional image acquisition method | |
KR100893381B1 (en) | Methods generating real-time stereo images | |
WO2019138515A1 (en) | Image forming device, eyeglass lens selection system, image forming method, and program | |
CN114859561B (en) | Wearable display device, control method thereof and storage medium | |
WO2024185428A1 (en) | Head-mounted display and image display method | |
Burleigh et al. | Fovography: A naturalistic imaging media |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||