CN116389705B - Three-dimensional scene realization method and system for augmented reality - Google Patents
Three-dimensional scene realization method and system for augmented reality
- Publication number: CN116389705B
- Application number: CN202310377980.6A
- Authority: CN (China)
- Legal status: Active
Classifications
- H04N13/363 — Stereoscopic video systems; Image reproducers using image projection screens
- H04N13/106 — Stereoscopic video systems; Processing image signals
- Y02B20/40 — Energy efficient lighting technologies; control techniques providing energy savings, e.g. smart controller or presence detection
Abstract
The invention provides a three-dimensional scene realization method and system for augmented reality. The method includes the following steps: within a set time period, an illumination system projects the depth layer binary images that make up an image frame to a display system in a set time sequence; the display system switches image frames according to the set time period; the display system sequentially receives and processes the depth layer binary images to obtain the corresponding depth layer images and projects them to an image transmission system; after receiving a corresponding depth layer image, the image transmission system transmits it to the front of the human eye so that it is superimposed on the real scene; and a focusing system projects the corresponding depth layer image transmitted in front of the human eye to a specific depth in space. The method makes it possible to realize three-dimensional scene display in an AR display device using the tomographic light field display method and to resolve the vergence-accommodation conflict of existing three-dimensional scene display methods for AR display devices.
Description
Technical Field
The invention belongs to the technical field of augmented reality, and particularly relates to a three-dimensional scene realization method and system for augmented reality.
Background
Augmented reality (Augmented Reality, AR) is a technology that ingeniously merges a virtual scene with a real scene and is an important development direction in the display field. In augmented reality, virtual scene pictures rendered by a computer are superimposed on real-world pictures through special devices such as waveguides and then transmitted into the human eye, so that virtual information and real information complement each other and the viewer is given a sensory experience beyond reality. Augmented reality technology has broad development prospects in the military, medical, industrial, educational, entertainment and other fields.
In order to enhance the sense of realism of virtual scene pictures, much of the current effort in augmented reality technology is devoted to making the pictures appear three-dimensional. Conventional augmented reality devices can only create a pseudo three-dimensional sensation using binocular parallax. The binocular parallax method exploits the fact that the two eyes receive slightly different images when viewing the same scene: two display devices deliver the two-dimensional image of the respective viewpoint to the observer's left and right eyes, and the brain fuses the two images into a three-dimensional impression. This fusion approach gives rise to the vergence-accommodation conflict (Vergence Accommodation Conflict, VAC). Because of the vergence-accommodation conflict, viewing a traditional waveguide augmented reality system for a long time leads to visual fatigue and can make the observer dizzy.
The depth resolution of the human eye is limited. When the human eye receives a series of two-dimensional images with similar depth positions within a short time, the brain synthesizes these two-dimensional images into one scene because of the persistence-of-vision effect, and the human eye approximately perceives a three-dimensional object. An effective method for resolving the vergence-accommodation conflict is therefore the tomographic light field display method. The tomographic light field display method divides the three-dimensional scene of each frame into a finite number of depth-layered images according to depth; the display device plays each depth-layered image in time sequence while giving each depth-layered image its correct spatial depth position, and all the depth-layered images received by the human eye within a short time form a three-dimensional impression. To play video smoothly, the number of images displayed per second must be greater than the refresh rate of the human eye. The refresh rate of the human eye refers to the number of images the human eye can recognize per second; the human eye typically perceives consecutive images at about 24 frames per second. To improve viewing comfort, current display refresh rates are usually above 60 Hz.
The tomographic light field display method needs to refresh all depth layer images within the period of one frame of a conventional display, so that all depth layer images combine into a complete three-dimensional scene. The refresh rate requirement the tomographic light field display method places on the display is therefore extremely high. For example, if the number of depth layers of each frame of the three-dimensional scene is set to 80 and the frame rate of the three-dimensional animation is set to 60 frames per second, the final display needs a refresh rate of at least 4800 frames per second, far beyond what current displays can provide, so the tomographic light field display method has not yet been applied in existing AR display devices.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a three-dimensional scene realization method and system for augmented reality.
The invention is realized by the following technical scheme:
the invention provides a three-dimensional scene realization method for augmented reality, which comprises the following steps:
in a set time period, utilizing an illumination system to project a binary image of a depth layer forming an image frame to a display system according to a set time sequence;
the display system transforms the image frames according to the set time period;
the display system sequentially receives the depth layer binary images, processes the corresponding depth layer binary images to obtain corresponding depth layer images, and projects the corresponding depth layer images to the image transmission system;
the image transmission system receives the corresponding depth layer image and transmits the corresponding depth layer image to the front of human eyes so that the corresponding depth layer image is overlapped with a real scene;
the corresponding depth layer image transmitted in front of the human eye is projected to a specific depth in space using a focusing system.
Further, before the illumination system projects the depth layer binary images forming the image frame to the display system in the set time sequence, the method further includes:
The illumination system receives the depth layer binary image comprising the image frame and stores the depth layer binary image comprising the image frame in a first storage device in the illumination system.
Further, before the display system transforms the image frames according to the set time period, the method further includes:
the display system receives the image frames and stores the image frames in a second storage device in the display system.
Further, the method further comprises the following steps:
storing the image frames forming the three-dimensional scene in a control system;
the control system carries out depth layering on the image frames to obtain a depth layer binary image forming the image frames;
the control system sends the binary image of the depth layer forming the image frame to the illumination system;
the control system sends the image frames to the display system.
Further, the control system performs depth layering on the image frame to obtain a depth layer binary image forming the image frame, including:
acquiring depth information of a scene shown by the image frame;
performing depth layering on the image frame based on the depth information by taking focal power as layering parameters to obtain a depth layered image;
and defining the current depth area of each depth layered image as a value 1, and defining other depth areas as a value 0 to obtain a depth layer binary image forming the image frame.
Further, the display system transforms the image frame according to the set time period, including:
the control system controls the image frames to be displayed on the display system in a changing mode according to the set time period based on the sequence of the image frames forming the three-dimensional scene.
Further, the projecting, by the illumination system within the set time period, of the depth layer binary images forming the image frame to the display system in the set time sequence includes:
in a set time period, the control system controls the illumination system to project a binary image of a depth layer forming an image frame to the display system according to a depth relation according to a set time sequence;
the illumination system projects the depth layer binary image to the display system in a binary pixel mode.
Further, the display system sequentially receives the depth layer binary images, processes the corresponding depth layer binary images to obtain corresponding depth layer images, and projects the corresponding depth layer images to an image transmission system, including:
the display system sequentially receives the depth layer binary images, the image pixel positions corresponding to the corresponding depth layer binary images in the image frames are lightened, the display system carries out color modulation on the image pixels to obtain corresponding depth layer images, and the corresponding depth layer images are projected to the image transmission system.
Further, the image transmission system is an optical waveguide, the image transmission system receives the corresponding depth layer image, and transmits the corresponding depth layer image to the front of the human eye, so that the corresponding depth layer image is superimposed with the real scene, and the method includes:
the coupling-in area of the optical waveguide couples in the corresponding depth layer image;
the waveguide substrate of the optical waveguide transmits the coupled-in depth layer image;
the coupling-out area of the optical waveguide couples out the corresponding depth layer image;
the real scene passes through the waveguide substrate of the optical waveguide and is superimposed with the corresponding depth layer image.
Further, the focusing system projects the corresponding depth layer image transmitted in front of the human eye to a specific depth in space, comprising:
the focusing system changes the light divergence angle of the corresponding depth layer image transmitted in front of the human eye by adjusting the first optical power, thereby projecting the corresponding depth layer image to a specific depth in space.
Further, the focusing system projects the corresponding depth layer image to a specific depth in space by changing a light divergence angle of the corresponding depth layer image transmitted to the front of the human eye by adjusting a change of the first optical power, comprising:
The control system presets a form change period of first focal power of the focusing system;
the control system sends a first control signal to the focusing system based on the morphological change period of the first focal power;
and after receiving the corresponding first control signal, the focusing system adjusts the first focal power to a corresponding value and projects the corresponding depth layer image to a specific depth in space.
Further, the method further comprises the following steps:
the focusing compensation system compensates the light modulation effect of the focusing system on the real scene by adjusting the change of the second focal power, and counteracts the interference effect of the focusing system on the light of the real scene.
Further, the focusing compensation system compensates the light modulation effect of the focusing system on the real scene by adjusting the change of the second focal power, counteracts the interference of the focusing system on the light of the real scene, and comprises:
the control system presets a form change period of the second focal power of the focusing compensation system;
the control system sends a second control signal to the focusing compensation system based on the form change period of the second focal power;
and after receiving the corresponding second control signal, the focusing compensation system adjusts the second focal power to a corresponding value to counteract the interference of the focusing system on the real scene light.
Further, the second optical power and the first optical power satisfy the following relationship:

φ₂ = −φ₁ / (1 − d·φ₁)

where φ₂ represents the second optical power, φ₁ represents the first optical power, and d represents the optical path between the image-side principal plane of the focusing compensation system and the object-side principal plane of the focusing system.
Further, the duration of the form change period of the first optical power is set corresponding to the duration of the set time period.
Further, the period of the morphological change of the first optical power is a triangular waveform.
Further, the duration corresponding to one triangular waveform is equal to the duration of the set time period.
Further, the time length corresponding to the rising waveform stage or the falling waveform stage of one triangular waveform is equal to the time length of the set time period.
Further, the control system controls the display system to always display the current image frame in the time length corresponding to one triangular waveform;
during the rising waveform duration phase of the triangular waveform,
the control system controls the illumination system to project a part of depth layer binary images forming the current image frame to the display system according to the depth relation of the depth layers from small to large;
the display system sequentially receives the partial depth layer binary images, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system, and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
The focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a near-to-far relationship;
the focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light;
during the falling waveform duration phase of the triangular waveform,
the control system controls the illumination system to project the binary image of the residual depth layer forming the current image frame to the display system according to the depth relation of the depth layer from small to large according to the set time sequence;
the display system sequentially receives the residual depth layer binary images, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system, and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
the focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a far-near relationship;
the focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light.
Further, the control system controls the display system to always display the current image frame in the time length corresponding to one triangular waveform;
during the rising waveform duration phase of the triangular waveform,
the control system controls the illumination system to sequentially project the binary images of the depth layers forming the odd sequence of the current image frame to the display system according to the set time sequence;
the display system sequentially receives the depth layer binary images of the odd sequence, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system, and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
the focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a near-to-far relationship;
the focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light;
during the falling waveform duration phase of the triangular waveform,
the control system controls the illumination system to sequentially project the even sequence of depth layer binary images forming the current image frame to the display system according to the set time sequence;
The display system sequentially receives the even-numbered sequence depth layer binary images, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system, and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
the focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a far-near relationship;
the focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light.
Further, in the rising waveform duration phase of the triangular waveform,
the control system controls the display system to display the current image frame;
the control system controls the illumination system to project all binary images of depth layers forming the current image frame to the display system according to the depth relation of the depth layers from small to large according to the set time sequence;
the display system sequentially receives all depth layer binary images of the current image frame, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
The focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a near-to-far relationship;
the focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light;
during the falling waveform duration phase of the triangular waveform,
the control system controls the illumination system not to perform projection operation to the display system or controls the display system to display a full black image.
Further, in the rising waveform duration phase of the triangular waveform,
the control system controls the display system to display the current frame;
the control system controls the illumination system to project all binary images of depth layers forming the current image frame to the display system according to the depth relation of the depth layers from small to large according to the set time sequence;
the display system sequentially receives all depth layer binary images of the current frame, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
the focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a near-to-far relationship;
The focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light;
during the falling waveform duration phase of the triangular waveform,
the control system controls the display system to display the next image frame;
the control system controls the illumination system to project all the binary images of the depth layers forming the next image frame to the display system according to the depth relation of the depth layers from small to large according to the set time sequence;
the display system sequentially receives all depth layer binary images of the next image frame, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
the focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a far-near relationship;
the focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light.
Further, before the illumination system projects the depth layer binary images forming the image frame to the display system in the set time sequence, the method further includes:
The illumination system projects the binary images of the depth layers forming the image frames to the relay system according to a set time sequence, and the binary images are modulated by the relay system and then sent to the display system.
Correspondingly, the invention also provides a three-dimensional scene realization system for augmented reality, which comprises a display system, an illumination system, an image transmission system and a focusing system;
the focusing system is arranged on one side of the image transmission system close to human eyes;
the illumination system is used for projecting the binary images of the depth layers forming the image frames to the display system according to a set time sequence in a set time period;
the display system is used for converting the image frames according to the set time period; sequentially receiving the depth layer binary images, processing the corresponding depth layer binary images to obtain corresponding depth layer images, and projecting the corresponding depth layer images to an image transmission system;
the image transmission system is used for receiving the corresponding depth layer image and transmitting the corresponding depth layer image to the front of the human eyes so that the corresponding depth layer image is overlapped with the real scene;
the focusing system is used for projecting the corresponding depth layer image transmitted to the front of human eyes to a specific depth in space.
Further, the lighting system further comprises a first storage device, wherein the first storage device is arranged in the lighting system;
the illumination system is further configured to receive a depth layer binary image that constitutes the image frame, and store the depth layer binary image that constitutes the image frame in the first storage device.
Further, the display system also comprises a second storage device, wherein the second storage device is arranged in the display system;
the display system is further configured to receive the image frame and store the image frame in the second storage device.
Further, the system also comprises a control system,
the control system is used for:
storing the image frames constituting a three-dimensional scene;
performing depth layering on the image frames to obtain depth layer binary images forming the image frames;
transmitting a depth layer binary image constituting the image frame to an illumination system;
and sending the image frame to a display system.
Further, the image transmission system adopts an optical waveguide, the optical waveguide comprises a waveguide substrate, and a coupling-in area and a coupling-out area are arranged on the waveguide substrate;
a coupling-in area for coupling in the corresponding depth layer image;
the waveguide substrate is used for transmitting the coupled corresponding depth layer image and superposing the corresponding depth layer image and the real scene;
And the coupling-out area is used for coupling out the corresponding depth layer image.
Further, the system also comprises a focusing compensation system, wherein the focusing compensation system is arranged at one side of the image transmission system far away from human eyes;
the focusing compensation system is used for compensating the light modulation effect of the focusing system on the real scene and counteracting the interference effect of the focusing system on the light of the real scene.
Further, the system also comprises a relay system, wherein the relay system is arranged between the illumination system and the display system;
before the illumination system projects the depth layer binary images constituting the image frames to the display system in a set timing,
the relay system is used for receiving the depth layer binary images which are transmitted by the illumination system according to the set time sequence and form the image frames, modulating the received depth layer binary images, and transmitting the modulated depth layer binary images to the display system.
Compared with the prior art, the technical scheme of the invention has the following beneficial effects:
the invention provides a three-dimensional scene realization method for augmented reality, wherein in a set time period, an illumination system projects a depth layer binary image forming an image frame to a display system according to a set time sequence; the display system converts the image frames according to a set time period, sequentially receives and processes the binary images of the depth layers, and obtains corresponding images of the depth layers to be projected to the image transmission system; after receiving the corresponding depth layer image, the image transmission system transmits the corresponding depth layer image to the front of the human eye, so that the corresponding depth layer image is overlapped with the real scene; and the focusing system projects the corresponding depth layer image transmitted in front of the human eye to a specific depth in space. According to the method, the scene of the image frame is divided into a plurality of depth layer binary images, the illumination system is adopted to project the depth layer binary images forming the current image frame to the display system according to the set time sequence in the display time period of each current image frame of the display system, the illumination system is used as an image high refresh rate device to refresh the depth layer binary images, the image refresh rate of the illumination system is equal to the product number of the image frame rate and the depth layering of each image frame, the display system is used as an image low refresh rate device, and the refresh rate of the display system is only required to be larger than the frame rate of the image frame, so that three-dimensional scene display can be realized by using a chromatographic light field display method in AR display equipment, and the radiation and convergence adjustment conflict of the existing AR display equipment three-dimensional scene display method is solved.
Preferably, the illumination device projects the depth layer binary image to the display system in binary-pixel form (e.g. 0 and 1), so that the high image refresh rate required of the illumination system can be met.
In addition, since the scene of the image frame is divided into a number of depth layer binary images and the illumination device projects the corresponding depth layer binary image to the display system in binary-pixel form, only the image pixels of the displayed image frame that correspond to the current depth layer binary image are illuminated, which greatly improves the contrast of the image displayed by the display system.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings described below are only some embodiments of the invention, and that a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a flow diagram of a three-dimensional scene implementation method for augmented reality according to the present invention;
FIG. 2 is a schematic diagram illustrating depth layering of depth information of a scene shown in an image frame;
FIG. 3 is a schematic view of the focal length relationship between a focusing system and a focus compensation system;
FIG. 4 is a schematic diagram showing a three-dimensional scene composed of two image frames using the augmented reality three-dimensional scene implementation method of the present invention;
FIG. 5 is a schematic diagram showing a three-dimensional scene composed of N (N > 2) image frames using the augmented reality three-dimensional scene implementation method of the present invention;
FIG. 6 is a schematic diagram of a first operational coordination between the control system and the focus system, focus compensation system, illumination system, and display system;
FIG. 7 is a schematic diagram of a second operational coordination between the control system and the focus system, focus compensation system, illumination system, and display system;
FIG. 8 is a schematic diagram of a third operational coordination between the control system and the focus system, focus compensation system, illumination system, and display system;
FIG. 9 is a schematic diagram of the overall structure of a three-dimensional scene implementation system for augmented reality of the present invention;
FIG. 10 is a schematic diagram of a first embodiment of a three-dimensional scene implementation system for augmented reality of the present invention;
FIG. 11 is a schematic diagram of a second embodiment of a three-dimensional scene implementation system for augmented reality of the present invention;
fig. 12 is a schematic diagram of a third embodiment of a three-dimensional scene realization system for augmented reality of the present invention.
Reference signs: 1 - control system; 1-1 - computer; 2 - illumination system; 2-1 - digital micromirror device; 2-2 - high refresh rate spatial light modulator; 3-1 - first lens; 3-2 - second lens; 3-3 - filter; 3-4 - half mirror; 3-4-1 - first half mirror; 3-4-2 - second half mirror; 3-5 - collimating lens; 4 - display system; 4-1 - liquid crystal display; 4-2 - LCOS projection device; 4-3 - DMD; 5 - coupling lens; 6 - image transmission system; 6-1 - optical waveguide; 7-1 - focusing system; 7-1-1 - focusing lens; 7-2 - focusing compensation system; 7-2-1 - focusing compensation lens; 8 - human eye.
Detailed Description
The technical solutions of the present invention will be clearly and completely described in conjunction with the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
At present, the image refresh rate of the traditional display device is seriously insufficient, and the requirement of three-dimensional scene display in AR display equipment by using a chromatographic light field display method cannot be met. Meanwhile, the traditional display device has limited brightness and limited contrast ratio, so that the display device can only be used under indoor dark conditions and can not be used under outdoor strong light conditions.
In order to overcome the above drawbacks, the present invention provides a three-dimensional scene implementation method for augmented reality, as shown in fig. 1, the general scheme of which is as follows:
s1: image frames are stored in the control system, where the image frames are image frames that make up a three-dimensional scene.
And the control system performs depth layering on the image frames to obtain depth layer binary images forming the image frames.
The control system sends the image frames to the display system and sends the depth layer binary images constituting the image frames to the illumination system.
S2: the illumination system receives the depth layer binary images that make up the image frames and stores the depth layer binary images that make up the image frames in a first storage device within the illumination system.
The display system receives the image frames and stores the image frames in a second storage device in the display system.
S3: in the course of the set period of time,
the illumination system projects the depth layer binary images constituting the image frames to the display system according to a set timing.
The display system transforms the image frames according to a set time period.
The display system sequentially receives the depth layer binary images, processes the corresponding depth layer binary images to obtain corresponding depth layer images, and projects the corresponding depth layer images to the image transmission system.
S4: the image transmission system receives the corresponding depth layer image and transmits the corresponding depth layer image to the front of the human eyes, so that the corresponding depth layer image is overlapped with the real scene.
S5: the focusing system projects a corresponding depth layer image transmitted in front of the human eye to a specific depth in space.
Meanwhile, the focusing compensation system compensates the light modulation effect of the focusing system on the real scene, and counteracts the interference effect of the focusing system on the light of the real scene.
The control system performs depth layering on the image frames to obtain depth layer binary images forming the image frames, and specifically comprises the following steps:
acquiring depth information of a scene shown by an image frame; the depth information may be obtained by using a conventional method, for example, the depth information may be obtained directly by an optical tracking function of the software.
And carrying out depth layering on the image frames based on the depth information by taking the focal power as layering parameters to obtain a depth layered image.
And defining the current depth area of each depth layered image as a value 1, and defining other depth areas as a value 0 to obtain a depth layer binary image forming the image frame.
Here, since the depth resolution of the human eye is related to optical power - strong for near objects and weak for far objects - layering in terms of optical power as the layering parameter, as described above, is the preferred scheme.
Fig. 2 shows an exemplary schematic diagram of depth layering of the depth information of the scene shown in an image frame. The depth of the scene shown in the image frame is 10 m, and the depth information of the scene is represented by a gray-scale map: the higher the gray value, the deeper the depth; the lower the gray value, the shallower the depth. The image frame is depth-layered at an interval of 1/7 of optical power (D in the figure denotes optical power), giving 28 depth-layered images. The current depth region of each depth-layered image is defined as the value 1 and the other depth regions as the value 0, yielding the 28 depth layer binary images that form the image frame.
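As a rough illustration of this layering step, the following Python sketch (the array names and the helper function are purely illustrative and not part of the patent) slices a metric depth map into binary depth-layer masks at equal optical-power intervals:

```python
import numpy as np

def depth_to_binary_layers(depth_map_m, num_layers=28, max_power=4.0):
    """Slice a metric depth map (metres) into binary depth-layer masks.

    Layering is done in optical power (dioptres, 1/distance), so layers are
    denser for near content and sparser for far content, matching the depth
    resolution of the human eye.
    """
    power_map = 1.0 / np.clip(depth_map_m, 1e-3, None)   # convert depth to dioptres
    edges = np.linspace(0.0, max_power, num_layers + 1)  # 28 slices of 1/7 D each
    layers = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = ((power_map >= lo) & (power_map < hi)).astype(np.uint8)
        layers.append(mask)                              # 1 inside the layer, 0 elsewhere
    return layers                                        # ordered from far to near
```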
Wherein, in the set time period, the illumination system projects the binary image of the depth layer forming the image frame to the display system according to the set time sequence, and the method specifically comprises the following steps:
in the course of the set period of time,
the control system controls the illumination system to project the binary images of the depth layers forming the image frames to the display system according to the depth relation according to the set time sequence;
the illumination system projects the depth layer binary image to the display system in a binary pixel mode.
In particular, the method comprises the steps of,
in the course of the set period of time,
the control system controls the first storage device to send the binary images of the depth layers forming the image frames to the illumination system according to the depth relation according to the set time sequence;
The illumination system projects binary images of the corresponding depth layers to the display system in a binary pixel mode.
The above "set time period", those skilled in the art can perform conventional setting according to actual needs. The above-mentioned "set time sequence" may be related to the actual needs by those skilled in the art, but needs to correspond to the depth relation of the depth layer binary image.
The display system transforms image frames according to a set time period, and specifically includes:
the control system controls the image frames to be displayed on the display system in a changing mode according to the set time period based on the sequence of the image frames forming the three-dimensional scene.
In particular, the method comprises the steps of,
the control system controls the second storage device to send image frames to the display system based on the sequence of the image frames forming the three-dimensional scene according to the set time period;
and in the set time period, the display system correspondingly displays the current image frame.
The display system sequentially receives the depth layer binary images, processes the corresponding depth layer binary images to obtain corresponding depth layer images, and projects the corresponding depth layer images to an image transmission system, and specifically comprises the following steps:
and the display system sequentially receives the depth layer binary images, the image pixel positions corresponding to the corresponding depth layer binary images in the image frames are lightened, the display system carries out color modulation on the image pixels to obtain corresponding depth layer images, and the corresponding depth layer images are projected to the image transmission system.
Specifically, a liquid crystal display is taken as an example for further explanation:
the liquid crystal display sequentially receives the depth layer binary images, and the image pixel positions in the image frame that correspond to the current depth layer binary image are lit up (under the action of an electric field, the liquid crystal display changes the alignment of the liquid crystal molecules so that the transmittance for the external light source - that is, for the received depth layer binary image - changes, completing the electro-optical conversion). The display system then colour-modulates these image pixels (the R, G and B primary-colour signals are driven separately, and colour reproduction in the time and space domains is completed through the red, green and blue primary-colour filter films) to obtain the corresponding depth layer image, which is projected to the image transmission system.
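Numerically, the joint effect of the binary illumination and the colour modulation of the display can be pictured as an element-wise product, as in the minimal sketch below (the array names are illustrative only):

```python
import numpy as np

def depth_layer_image(frame_rgb, layer_mask):
    """Only the pixels lit by the binary depth-layer illumination appear in the
    output; the display colour-modulates those pixels, so the depth layer image
    is the displayed frame masked by the binary layer.
    frame_rgb: H x W x 3 colour image frame, layer_mask: H x W array of 0/1."""
    return frame_rgb * layer_mask[..., np.newaxis]
```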
The image transmission system receives the corresponding depth layer image and transmits the corresponding depth layer image to the front of human eyes, so that the corresponding depth layer image is overlapped with a real scene.
The image transmission system takes the optical waveguide as an example, and specifically includes:
the coupling-in area of the optical waveguide is coupled in the corresponding depth layer image;
the waveguide matrix of the optical waveguide transmits the coupled corresponding depth layer image;
The coupling-out area of the optical waveguide is coupled out of the corresponding depth layer image;
the real scene passes through the waveguide matrix of the optical waveguide and is overlapped with the corresponding depth layer image.
Wherein the focusing system projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space, comprising:
the focusing system projects the corresponding depth layer image to a specific depth in space by adjusting a change in the first optical power to change a divergence angle of light transmitted to the corresponding depth layer image in front of a human eye.
In particular, the method comprises the steps of,
the control system presets a form change period of the first optical power of the focusing system; the form change period of the first optical power here refers to the single repeating form in which the first optical power varies with time.
The control system sends a first control signal to the focusing system based on the morphological change period of the first focal power;
and after receiving the corresponding first control signal, the focusing system adjusts the first focal power to a corresponding value and projects the corresponding depth layer image to a specific depth in space.
The focusing compensation system compensates the light modulation effect of the focusing system on the real scene by adjusting the change of the second focal power, so that the interference of the focusing system on the light of the real scene is counteracted.
More specifically, the method comprises the steps of,
the control system presets a form change period of the second optical power of the focusing compensation system; the form change period of the second optical power here refers to the single repeating form in which the second optical power varies with time.
The control system sends a second control signal to the focusing compensation system based on the form change period of the second focal power;
and after receiving the corresponding second control signal, the focusing compensation system adjusts the second focal power to a corresponding value to counteract the interference of the focusing system on the real scene light.
The form change period of the second optical power must be set to match the form change period of the first optical power; for example, if the first optical power always varies within a negative range, the second optical power must always vary within a positive range.
In order to better ensure that the focusing compensation system compensates the light modulation effect of the focusing system on the real scene by adjusting the change of the second optical power, and thus cancels the interference of the focusing system with the real-scene light, the second optical power and the first optical power satisfy the following relationship, which guarantees that the values of the first and second optical power at corresponding moments cancel each other:

φ₂ = −φ₁ / (1 − d·φ₁)

where φ₂ represents the second optical power, φ₁ represents the first optical power, and d represents the optical path between the image-side principal plane of the focusing compensation system and the object-side principal plane of the focusing system.
As shown in fig. 3, the focusing system and the focusing compensation system form a coaxial optical system: H₁ denotes the object-side principal plane of the focusing system, H'₁ the image-side principal plane of the focusing system, H₂ the object-side principal plane of the focusing compensation system, H'₂ the image-side principal plane of the focusing compensation system, and d the optical path between the image-side principal plane H'₂ of the focusing compensation system and the object-side principal plane H₁ of the focusing system.
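Taking the relationship above as the zero-net-power condition for two coaxial optical systems separated by the optical path d (an interpretation of the formula, not a statement from the patent text), a quick numerical check with illustrative values looks like this:

```python
def compensation_power(phi1, d):
    """Second optical power that makes the combined power of two coaxial
    systems zero: phi1 + phi2 - d * phi1 * phi2 = 0."""
    return -phi1 / (1.0 - d * phi1)

phi1 = -2.0   # dioptres, example value of the first (focusing) optical power
d = 0.02      # metres, example optical path between the principal planes
phi2 = compensation_power(phi1, d)
combined = phi1 + phi2 - d * phi1 * phi2
print(phi2, combined)  # phi2 is positive when phi1 is negative; combined power is 0
```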
The above method of the invention is further described below, taking as an example a three-dimensional scene in which the English letters A, B, C and D are displayed at different positions in space:
as shown in fig. 4, the three-dimensional scene is composed of two image frames: the first image frame contains the English letters A and B at different depth positions, and the second image frame contains the English letters C and D at different depth positions; both are stored in the control system.
The control system performs depth layering on the first image frame to obtain a depth layer binary image A and a depth layer binary image B, and performs depth layering on the second image frame to obtain a depth layer binary image C and a depth layer binary image D.
The control system sequentially sends the first image frame and the second image frame into a second storage device of the display system, and the control system sends the depth layer binary image A, the depth layer binary image B, the depth layer binary image C and the depth layer binary image D into the first storage device of the lighting system.
In the first set period of time,
the control system controls a first image frame comprising English letters A, B with different depth positions to be displayed on the display system;
the control system controls the illumination system to sequentially project the depth layer binary image B and the depth layer binary image A to the display system,
the display system sequentially receives a depth layer binary image B and a depth layer binary image A, firstly, the position of an image pixel corresponding to the depth layer binary image B in an image frame is lightened, and the display system carries out color modulation on the image pixel to obtain the depth layer image B; then, the image pixel position corresponding to the depth layer binary image A in the image frame is lightened, and the display system carries out color modulation on the image pixel to obtain the depth layer image A.
The depth layer image B and the depth layer image A are sequentially transmitted to the front of the human eyes after passing through the image transmission system, the focusing system firstly projects the depth layer image B to a far space position, and then projects the depth layer image A to a near space position.
Similarly, in the next set time period the above operations are performed for the second image frame containing the English letters C and D at different depth positions, so that the focusing system first projects the depth layer image D to the far spatial position and then projects the depth layer image C to the near spatial position.
From the above description, it can be seen that, within one image frame, the display system displays the current image frame unchanged, and the number of image refreshes of the illumination system is equal to the number of depth layer images of the image frame.
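The cooperation just described - the display system holds the current image frame while the illumination system and the focusing system step through its depth layers - can be summarised in the following sketch; the function names and the stub wiring are purely illustrative and do not correspond to any components named in the patent:

```python
def play_three_dimensional_scene(frames, layer_masks, show_frame, project_mask, set_layer_power):
    """frames[i] is one image frame; layer_masks[i] lists its depth layer binary
    images in the order set by the control system; the three callables stand in
    for the display system, the illumination system and the focusing system."""
    for frame, masks in zip(frames, layer_masks):
        show_frame(frame)                        # display refreshes once per image frame
        for depth_index, mask in enumerate(masks):
            project_mask(mask)                   # illumination refreshes once per depth layer
            set_layer_power(depth_index)         # focusing system places the layer at its depth

# Example wiring with print stubs, following the two-frame A/B/C/D example above:
play_three_dimensional_scene(
    frames=["frame AB", "frame CD"],
    layer_masks=[["mask B", "mask A"], ["mask D", "mask C"]],
    show_frame=lambda f: print("display:", f),
    project_mask=lambda m: print("  illuminate:", m),
    set_layer_power=lambda i: print("  focus layer", i),
)
```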
Fig. 5 is a schematic diagram of the cooperation between the illumination system and the display system when the three-dimensional scene is composed of N image frames and each image frame contains M depth layer images at different depth positions. The image refresh rates of the illumination system and of the display system satisfy the following relationship:

FPS_illuminant = N_depth × FPS_eyes

where FPS_illuminant represents the image refresh rate of the illumination system, N_depth represents the number of depth layer images per frame, and FPS_eyes represents the frame rate at which the human eye receives and synthesizes images, which is also the image refresh rate of the display system.
For example, if the human eye receives and synthesizes images at a frame rate of 60 frames/second, and each image frame is composed of 80 layered images with different depths, the image refresh rate of the lighting device should be 4800Hz or more, and the image refresh rate of the display system should be 60Hz or more.
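As a sanity check of this relation with the figures quoted above (a trivial calculation, shown only for concreteness):

```python
def required_illumination_refresh_rate(frame_rate_hz, depth_layers_per_frame):
    """Image refresh rate the illumination system must provide:
    FPS_illuminant = N_depth * FPS_eyes."""
    return frame_rate_hz * depth_layers_per_frame

print(required_illumination_refresh_rate(60, 80))  # 4800 (Hz)
```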
In this method the scene of an image frame is divided into a number of depth layer binary images, and during the display time period of each current image frame of the display system the illumination system projects the depth layer binary images forming that frame to the display system in the set time sequence. The illumination system acts as the high-refresh-rate image device and refreshes the depth layer binary images: its image refresh rate equals the product of the image frame rate and the number of depth layers per image frame. The display system acts as the low-refresh-rate image device, and its refresh rate only needs to be greater than the frame rate of the image frames. Three-dimensional scene display can therefore be realized in an AR display device using the tomographic light field display method, resolving the vergence-accommodation conflict of existing three-dimensional scene display methods for AR display devices.
Preferably, the illumination device projects the depth layer binary image to the display system in binary-pixel form (e.g. 0 and 1), so that the high image refresh rate required of the illumination system can be met.
Moreover, conventional display systems such as liquid crystal displays are illuminated by a single backlight or a partitioned backlight, and the number of backlight sources is much smaller than the number of liquid crystal pixels. One backlight source illuminates many liquid crystal pixels simultaneously, so some pixels remain partially light-transmissive even when they are not required to emit light, resulting in low contrast. The invention divides the scene of the image frame into a number of depth layer binary images and uses the illumination device to project the corresponding depth layer binary image to the display system in binary-pixel form, so that only the image pixels of the displayed image frame that correspond to the current depth layer binary image are illuminated, which greatly improves the contrast of the image displayed by the display system.
As a preferred embodiment, in order to better coordinate the operation of the control system with the focusing system, the focusing compensation system, the illumination system and the display system, the duration of the form change period of the first optical power is set in correspondence with the duration of the set time period.
The waveform of the form change period of the first optical power may be arbitrary.
The duration of the form change period of the first optical power is equal to the duration of the set time period, or it is an integer multiple (greater than one) of the duration of the set time period.
For example, when the form change period of the first optical power is a triangular waveform, the duration of one triangular waveform may be equal to the duration of the set time period, or the duration of the rising stage or the falling stage of the triangular waveform may be equal to the duration of the set time period.
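A minimal sketch (Python/NumPy, illustrative numbers only) of such a triangular form change of the first optical power and of the two alignment options just described:

```python
import numpy as np

def triangular_power(t: np.ndarray, period: float, p_min: float, p_max: float) -> np.ndarray:
    """Triangular form change of the first optical power: rises from p_min to
    p_max over the first half of the period and falls back over the second half."""
    phase = (t % period) / period                      # position within one period, 0..1
    tri = np.where(phase < 0.5, 2.0 * phase, 2.0 * (1.0 - phase))
    return p_min + (p_max - p_min) * tri

set_period = 1.0 / 60.0          # display period of one image frame (assumed), seconds
period_a = set_period            # option 1: one whole triangle == one set time period
period_b = 2.0 * set_period      # option 2: rising (or falling) stage == one set time period

t = np.linspace(0.0, period_b, 9)
powers = triangular_power(t, period_b, p_min=0.5, p_max=3.0)   # diopters, illustrative
```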
For the case where the form change period of the first optical power is a triangular waveform and the duration of one triangular waveform is equal to the duration of the set time period, the specific operation coordination between the control system and the focusing system, the focusing compensation system, the illumination system and the display system is as follows:
1. Operation coordination mode one, as shown in fig. 6:
the control system controls the display system to display the current image frame throughout the duration corresponding to one triangular waveform;
during the rising waveform stage of the triangular waveform,
the control system controls the illumination system to project, according to the set time sequence, part of the depth layer binary images constituting the current image frame to the display system in order of depth from small to large;
the display system sequentially receives these depth layer binary images, processes each corresponding depth layer binary image to obtain the corresponding depth layer image, projects it to the image transmission system, and the image transmission system transmits it to the front of the human eye;
the focusing system receives the corresponding first control signal sent by the control system and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space in near-to-far order;
the focusing compensation system receives the corresponding second control signal sent by the control system and counteracts the interference effect of the focusing system on the real scene light.
During the falling waveform stage of the triangular waveform,
the control system controls the illumination system to project, according to the set time sequence, the remaining depth layer binary images constituting the current image frame to the display system in order of depth from small to large;
the display system sequentially receives the remaining depth layer binary images, processes each corresponding depth layer binary image to obtain the corresponding depth layer image, projects it to the image transmission system, and the image transmission system transmits it to the front of the human eye;
the focusing system receives the corresponding first control signal sent by the control system and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space in far-to-near order;
the focusing compensation system receives the corresponding second control signal sent by the control system and counteracts the interference effect of the focusing system on the real scene light.
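A schematic rendering of this first coordination mode, written as a Python sketch with hypothetical helper callables (`project`, `set_focus` and `hold_display` are placeholders, not components defined by the invention); it only illustrates the ordering of operations described above:

```python
# Minimal sketch (Python) of operation coordination mode one: the display holds
# the current frame for the whole triangular period, the rising half projects
# part of the layers near-to-far, and the falling half projects the remaining
# layers while the focusing system sweeps back (far-to-near, per the focusing
# description).
def run_mode_one(frame, layer_masks, project, set_focus, hold_display):
    hold_display(frame)                        # display holds the current frame all period
    split = len(layer_masks) // 2
    for i, mask in enumerate(layer_masks[:split]):           # rising half
        project(mask)                          # illumination projects the binary mask
        set_focus(depth_index=i)               # focusing system steps towards far depths
    for i, mask in reversed(list(enumerate(layer_masks[split:], start=split))):
        project(mask)                          # falling half: remaining layers
        set_focus(depth_index=i)               # focusing system sweeps back towards near

if __name__ == "__main__":
    masks = [f"mask{i}" for i in range(6)]     # 6 hypothetical depth layers
    run_mode_one("frame", masks,
                 project=lambda m: print("project", m),
                 set_focus=lambda depth_index: print("focus -> layer", depth_index),
                 hold_display=lambda f: print("display holds", f))
```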
2. Operation coordination mode two, as shown in fig. 7:
the control system controls the display system to display the current image frame throughout the duration corresponding to one triangular waveform;
during the rising waveform stage of the triangular waveform,
the control system controls the illumination system to sequentially project, according to the set time sequence, the odd-sequence depth layer binary images constituting the current image frame to the display system;
the display system sequentially receives the odd-sequence depth layer binary images, processes each corresponding depth layer binary image to obtain the corresponding depth layer image, projects it to the image transmission system, and the image transmission system transmits it to the front of the human eye;
the focusing system receives the corresponding first control signal sent by the control system and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space in near-to-far order;
the focusing compensation system receives the corresponding second control signal sent by the control system and counteracts the interference effect of the focusing system on the real scene light.
During the falling waveform stage of the triangular waveform,
the control system controls the illumination system to sequentially project, according to the set time sequence, the even-sequence depth layer binary images constituting the current image frame to the display system;
the display system sequentially receives the even-sequence depth layer binary images, processes each corresponding depth layer binary image to obtain the corresponding depth layer image, projects it to the image transmission system, and the image transmission system transmits it to the front of the human eye;
the focusing system receives the corresponding first control signal sent by the control system and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space in far-to-near order;
the focusing compensation system receives the corresponding second control signal sent by the control system and counteracts the interference effect of the focusing system on the real scene light.
In the idle regions between the odd-sequence depth layer binary images and between the even-sequence depth layer binary images, the control system may control the illumination system not to project to the display system, or may control the display system to display a fully black image, i.e., perform a black frame insertion operation.
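The odd/even interleaving and the idle (black frame) slots can be sketched as a simple schedule; the slot model below is an assumption made for illustration (Python), not a prescribed implementation:

```python
def interleaved_schedule(n_layers: int):
    """Return (phase, slot, layer) triples for one triangular period; layer is
    None for an idle slot that is shown as a black frame (no projection)."""
    schedule = []
    # Rising half: optical power sweeps towards far depths; the odd-sequence
    # layers (1st, 3rd, ... => 0-based indices 0, 2, ...) are lit in their
    # slots, while the slots belonging to even-sequence layers stay dark.
    for slot, layer in enumerate(range(n_layers)):
        schedule.append(("rising", slot, layer if layer % 2 == 0 else None))
    # Falling half: power sweeps back, depths are revisited in reverse order;
    # now the even-sequence layers (0-based indices 1, 3, ...) are lit.
    for slot, layer in enumerate(reversed(range(n_layers))):
        schedule.append(("falling", slot, layer if layer % 2 == 1 else None))
    return schedule

if __name__ == "__main__":
    for entry in interleaved_schedule(4):
        print(entry)   # e.g. ('rising', 0, 0), ('rising', 1, None), ...
```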
For the case where the form change period of the first optical power is a triangular waveform and the duration of the rising waveform stage or the falling waveform stage of the triangular waveform is equal to the duration of the set time period, the specific operation coordination between the control system and the focusing system, the focusing compensation system, the illumination system and the display system is as follows:
1. Operation coordination mode one, as shown in fig. 8:
During the rising waveform stage of the triangular waveform,
the control system controls the display system to display the current image frame;
the control system controls the illumination system to project, according to the set time sequence, all depth layer binary images constituting the current image frame to the display system in order of depth from small to large;
the display system sequentially receives all depth layer binary images of the current image frame, processes each corresponding depth layer binary image to obtain the corresponding depth layer image, projects it to the image transmission system, and the image transmission system transmits it to the front of the human eye;
the focusing system receives the corresponding first control signal sent by the control system and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space in near-to-far order;
the focusing compensation system receives the corresponding second control signal sent by the control system and counteracts the interference effect of the focusing system on the real scene light.
During the falling waveform stage of the triangular waveform,
the control system controls the illumination system not to project to the display system, or controls the display system to display a fully black image, i.e., performs a black frame insertion operation.
Since the focusing operation of the focusing system is continuous, adjusting the focal power from one depth layer to that of another takes a certain time, so the depth layer images may appear to shake when the image duty ratio is 1. A black frame can therefore be inserted between depth layer images as appropriate: no image is displayed while the focusing system is still changing, and the image is displayed once the focusing system has settled, which eliminates the image shake.
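A small sketch (Python, with assumed timing values) of this duty-ratio idea, splitting each depth layer slot into a settling interval shown black and a lit interval:

```python
# Minimal sketch (Python, assumed timing values): each depth layer slot is split
# into a settling interval (black frame while the focal power is still changing)
# and a lit interval (layer displayed once the focus has settled), so the image
# duty ratio stays below 1.
def layer_slot_timing(slot_duration: float, settle_time: float):
    """Return (black_time, lit_time) for one depth layer slot, in seconds."""
    black_time = min(settle_time, slot_duration)
    return black_time, slot_duration - black_time

slot = (1.0 / 60.0) / 80.0                                  # frame period / layers per frame
black, lit = layer_slot_timing(slot, settle_time=50e-6)    # 50 us settle time, assumed
duty_ratio = lit / slot                                     # about 0.76 with these numbers
```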
2. Operation coordination mode two:
During the rising waveform stage of the triangular waveform,
the control system controls the display system to display the current image frame;
the control system controls the illumination system to project, according to the set time sequence, all depth layer binary images constituting the current image frame to the display system in order of depth from small to large;
the display system sequentially receives all depth layer binary images of the current image frame, processes each corresponding depth layer binary image to obtain the corresponding depth layer image, projects it to the image transmission system, and the image transmission system transmits it to the front of the human eye;
the focusing system receives the corresponding first control signal sent by the control system and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space in near-to-far order;
the focusing compensation system receives the corresponding second control signal sent by the control system and counteracts the interference effect of the focusing system on the real scene light.
During the falling waveform stage of the triangular waveform,
the control system controls the display system to display the next image frame;
the control system controls the illumination system to project, according to the set time sequence, all depth layer binary images constituting the next image frame to the display system in order of depth from small to large;
the display system sequentially receives all depth layer binary images of the next image frame, processes each corresponding depth layer binary image to obtain the corresponding depth layer image, projects it to the image transmission system, and the image transmission system transmits it to the front of the human eye;
the focusing system receives the corresponding first control signal sent by the control system and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space in far-to-near order;
the focusing compensation system receives the corresponding second control signal sent by the control system and counteracts the interference effect of the focusing system on the real scene light.
It should be noted that the specific operation coordination described above among the control system, the focusing system, the focusing compensation system, the illumination system and the display system is only an example; they can be coordinated in many ways, a person skilled in the art can make routine choices according to actual needs, and the duty ratio of the image can likewise be adjusted flexibly according to actual needs.
In a preferred embodiment, before the illumination system projects the depth layer binary images constituting the image frame to the display system according to the set time sequence, the method further includes:
the illumination system projects the depth layer binary images constituting the image frame to the relay system according to the set time sequence, and the images are modulated by the relay system and then sent to the display system. Here, "modulating" includes operations such as changing the light propagation direction, filtering, and adjusting the image size. Changing the light propagation direction adjusts the light path and can be realized with a reflector or a half-transmitting half-reflecting device; filtering removes higher diffraction orders of the light with a filter; adjusting the image size makes the image frame of the display system consistent in size with the depth layer binary image projected by the illumination system, ensuring the pixel correspondence, and can be realized by lens imaging, for example with a single lens or a lens group.
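As an illustration of the size-adjustment part of this modulation, the sketch below (Python/NumPy, hypothetical resolutions) rescales a binary mask to the display pixel grid with nearest-neighbour sampling so that the binary values and the pixel correspondence are preserved:

```python
import numpy as np

def match_mask_to_display(mask: np.ndarray, display_shape: tuple) -> np.ndarray:
    """Nearest-neighbour rescale of a depth layer binary mask to the display
    pixel grid, so each display pixel is driven by exactly one binary value and
    the pixel correspondence between illuminator and display is preserved."""
    rows = (np.arange(display_shape[0]) * mask.shape[0]) // display_shape[0]
    cols = (np.arange(display_shape[1]) * mask.shape[1]) // display_shape[1]
    return mask[np.ix_(rows, cols)]

# Illustrative only: map a small binary pattern onto a coarser/finer display grid.
resized = match_mask_to_display(np.eye(4, dtype=np.uint8), (8, 8))
```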
Corresponding to the above three-dimensional scene realization method for augmented reality, the present invention further provides a three-dimensional scene realization system for augmented reality, as shown in fig. 9, which specifically includes a control system 1, a display system 4, an illumination system 2, an image transmission system 6, a focusing system 7-1, a focusing compensation system 7-2 and a relay system (not shown in the figure), wherein a second storage device is disposed in the display system 4 and a first storage device is disposed in the illumination system 2.
wherein,
the control system 1 is respectively connected with the display system 4, the illumination system 2, the focusing system 7-1 and the focusing compensation system 7-2.
The relay system is arranged between the illumination system 2 and the display system 4.
The display system 4 is adjacent to the image transmission system 6.
The focusing system 7-1 is arranged on the side, close to the human eyes, of the image transmission system 6, and the focusing compensation system 7-2 is arranged on the side, far from the human eyes, of the image transmission system 6.
A control system 1 for performing the following operations:
image frames are stored, wherein the image frames are image frames that make up a particular three-dimensional scene.
And carrying out depth layering on the image frames to obtain a depth layer binary image forming the image frames.
The depth layer binary images constituting the image frames are sent to the illumination system 2, and the illumination system 2 is controlled to project the depth layer binary images to the display system 4 through the relay system according to a set time sequence within a set time period.
The image frames are sent to the display system 4 and the display system 4 is controlled to transform the image frames according to a set time period.
A lighting system 2 for performing the following operations:
the depth layer binary image forming device is used for receiving the depth layer binary image forming the image frame and storing the depth layer binary image forming the image frame in the first storage device.
And in the set time period, under the control of the control system 1, the binary images of the depth layers forming the image frames are projected to the relay system according to the set time sequence.
And the relay system is used for receiving the depth layer binary images which are transmitted by the illumination system 2 according to the set time sequence and form the image frame, carrying out modulation operations such as changing the light propagation direction, filtering, adjusting the image size and the like on the received depth layer binary images, and transmitting the modulated depth layer binary images to the display system 4.
A display system 4 for performing the following operations:
an image frame is received and stored in a second storage device.
Under the control of the control system 1, the image frames are switched according to the set time period, so that the display system 4 displays the corresponding current image frame within each set time period.
And sequentially receiving the depth layer binary images, processing the corresponding depth layer binary images to obtain corresponding depth layer images, and projecting the corresponding depth layer images to the image transmission system 6. Specifically, the image pixel positions corresponding to the corresponding depth layer binary images in the image frame are lit, the display system 4 performs color modulation on those image pixels to obtain the corresponding depth layer images, and projects the corresponding depth layer images to the image transmission system 6.
And the image transmission system 6 is used for receiving the corresponding depth layer image and transmitting the corresponding depth layer image to the front of the human eyes so that the corresponding depth layer image is overlapped with the real scene.
Taking as an example an optical waveguide with a coupling-in region and a coupling-out region disposed on a waveguide substrate:
and the coupling-in area is used for coupling in the corresponding depth layer image.
The waveguide substrate is used for transmitting the coupled corresponding depth layer image and superposing the corresponding depth layer image and the real scene.
And the coupling-out area is used for coupling out the corresponding depth layer image.
A focusing system 7-1 for projecting the above-mentioned corresponding depth layer image transmitted to the front of the human eye to a specific depth in space.
The focusing compensation system 7-2 is used for compensating the light modulation effect of the focusing system 7-1 on the real scene and counteracting the interference effect of the focusing system 7-1 on the real scene light.
The control system 1 may be a computer or another signal generator, and the connections between the control system 1 and the display system 4, the illumination system 2, the focusing system 7-1, the focusing compensation system 7-2, etc. may be wired or wireless.
The display system 4 modulates the input light to generate the corresponding colors. The display system 4 serves as the low image refresh rate device and may use existing components, including but not limited to a liquid crystal display, an LCOS (Liquid Crystal on Silicon) projection device, a spatial light modulator, and an optical switch.
The illumination system 2 serves as the high image refresh rate device, and each pixel unit of the illumination system 2 only needs two states, 0/1 or on/off; that is, each pixel unit of the illumination system 2 is binary and is responsible for lighting the corresponding pixel unit of the display system 4. The illumination system 2 may use existing components, including but not limited to high refresh rate optical switches and high refresh rate spatial light modulators. A high refresh rate optical switch such as a high refresh rate digital micromirror device (Digital Micromirror Device, DMD) uses rotating micromirrors to switch each pixel on and off and can reach image refresh rates of tens of thousands of hertz; among high refresh rate spatial light modulators, a ferroelectric liquid crystal spatial light modulator has an image refresh rate of hundreds of hertz, and micro-electro-mechanical system (Microelectro Mechanical Systems, MEMS) devices can reach image refresh rates of tens of thousands of hertz.
The image transmission system 6 is an augmented reality coupling device in the broad sense and only needs to transmit the corresponding depth layer image projected by the display system to the front of the human eye. The image transmission system 6 includes, but is not limited to, a half-transmitting half-reflecting mirror, an array optical waveguide, a diffractive optical waveguide, and the like.
The focal powers of the focusing system 7-1 and the focusing compensation system 7-2 must cancel each other (one positive, one negative) at every moment, so that light from the external real scene is not affected by the focusing system. The external real scene light enters the human eye through the combined action of the focusing compensation system 7-2 and the focusing system 7-1, which is equivalent to the light not being modulated, so viewing the external real scene through the system is no different from viewing it with the naked eye.
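This cancellation can be checked with standard Gaussian optics: for two thin elements separated by an optical path d, the combined power is φ1 + φ2 − d·φ1·φ2, so the compensation power follows from setting this to zero. The sketch below (Python, example values only) is our illustration of that condition, not a formula taken verbatim from the patent:

```python
# Minimal sketch (Python, example values only): for two thin elements separated
# by an optical path d, the combined power is phi1 + phi2 - d*phi1*phi2 (standard
# Gaussian optics), so the compensation power is the phi2 that makes this zero.
def compensation_power(phi1: float, d: float) -> float:
    """Second optical power (diopters) cancelling phi1 across separation d (metres)."""
    return phi1 / (d * phi1 - 1.0)

def combined_power(phi1: float, phi2: float, d: float) -> float:
    return phi1 + phi2 - d * phi1 * phi2

phi1 = 2.5                                  # example first optical power, diopters
d = 0.02                                    # 20 mm optical path, assumed
phi2 = compensation_power(phi1, d)          # about -2.63 diopters
assert abs(combined_power(phi1, phi2, d)) < 1e-12
```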
The focusing system 7-1 and the focusing compensation system 7-2 may be existing components, such as conventional focusing lenses (for example the EL-16-40 from Optotune) or diffractive optical elements (for example the Moire Lens from Diffratec).
The relay system may be a combination of conventional components, such as a combination of lenses and filters.
The positional relationship between the illumination system 2, the display system 4 and the image transmission system 6 is not particularly limited, and they may be close together, far apart, inclined, perpendicular, or the like. The illumination system 2 and the display system 4 may be on the same side of the image transmission system 6 as the human eye, or on a different side.
There is no particular requirement on the spacing between the illumination system 2, the display system 4, the image transmission system 6, the focusing system 7-1, the focusing compensation system 7-2, etc.; they may be closely attached or separated by a certain distance.
The three-dimensional scene realization system for augmented reality of the present invention is described below with reference to specific embodiments.
Example 1
As shown in fig. 10, the three-dimensional scene realization system for augmented reality of the present embodiment includes a computer 1-1, a Digital Micromirror Device (DMD) 2-1, a relay system, a liquid crystal display 4-1, an in-coupling lens 5, an optical waveguide 6-1, a focusing lens 7-1-1, and a focusing compensation lens 7-2-1, wherein the relay system includes a filter 3-3 and an optical 4f system composed of a first lens 3-1 and a second lens 3-2, the filter 3-3 is disposed between the first lens 3-1 and the second lens 3-2, and the first lens 3-1 is close to the Digital Micromirror Device (DMD) 2-1.
The computer 1-1 is respectively connected with the digital micro-mirror device 2-1, the liquid crystal display 4-1, the focusing lens 7-1-1 and the focusing compensation lens 7-2-1, the relay system is arranged between the digital micro-mirror device 2-1 and the liquid crystal display 4-1, the coupling lens 5 is arranged between the liquid crystal display 4-1 and the optical waveguide 6-1, the focusing lens 7-1-1 is arranged on one side of the optical waveguide 6-1 close to the human eye, and the focusing compensation lens 7-2-1 is arranged on one side of the optical waveguide 6-1 far away from the human eye 8.
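For this embodiment, one design consideration (our assumption, with hypothetical pitches and focal lengths) is that the lateral magnification of the optical 4f system, −f2/f1, can be chosen to match the DMD pixel pitch to the liquid crystal display pixel pitch so the binary illumination pixels stay aligned with the display pixels:

```python
# Minimal sketch (Python, hypothetical pixel pitches and focal lengths): the
# lateral magnification of an optical 4f system is -f2/f1, so choosing f2/f1
# equal to the ratio of display to DMD pixel pitch keeps the projected binary
# pixels aligned with the liquid crystal display pixels.
dmd_pitch_um = 7.6                                  # assumed DMD micromirror pitch
lcd_pitch_um = 15.2                                 # assumed LCD pixel pitch
f1_mm = 50.0                                        # first lens focal length, assumed
f2_mm = f1_mm * (lcd_pitch_um / dmd_pitch_um)       # 100 mm -> |magnification| = 2
```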
Example 2
As shown in fig. 11, the three-dimensional scene realization system for augmented reality of the present embodiment includes a computer 1-1, a Digital Micromirror Device (DMD) 2-1, a relay system, an LCOS projection device 4-2, an in-coupling lens 5, an optical waveguide 6-1, a focusing lens 7-1-1, and a focusing compensation lens 7-2-1, wherein the relay system includes a filter 3-3, an optical 4f system composed of a first lens 3-1 and a second lens 3-2, and a half mirror 3-4, the filter 3-3 is disposed between the first lens 3-1 and the second lens 3-2, the first lens 3-1 is close to the Digital Micromirror Device (DMD) 2-1, and the half mirror 3-4 is close to the second lens 3-2.
The computer 1-1 is respectively connected with the digital micro-mirror device 2-1, the LCOS projection device 4-2, the focusing lens 7-1-1 and the focusing compensation lens 7-2-1, the relay system is arranged between the digital micro-mirror device 2-1 and the LCOS projection device 4-2, the coupling lens 5 is arranged between the LCOS projection device 4-2 and the optical waveguide 6-1, the focusing lens 7-1-1 is arranged on one side of the optical waveguide 6-1 close to the human eye, and the focusing compensation lens 7-2-1 is arranged on one side of the optical waveguide 6-1 far away from the human eye 8.
Example 3
As shown in fig. 12, the three-dimensional scene realization system for augmented reality of the present embodiment includes a computer 1-1, a spatial light modulator 2-2 with a high refresh rate, a relay system, a DMD4-3, an in-coupling lens 5, an optical waveguide 6-1, a focusing lens 7-1-1, and a focusing compensation lens 7-2-1, wherein the relay system includes a collimating lens 3-5, a filter 3-3, an optical 4f system composed of a first lens 3-1 and a second lens 3-2, and a first half mirror 3-4-1 and a second half mirror 3-4-2.
The computer 1-1 is respectively connected with the high refresh rate spatial light modulator 2-2, the DMD4-3, the focusing lens 7-1-1 and the focusing compensation lens 7-2-1, the relay system is arranged in front of the DMD4-3, the coupling lens 5 is arranged between the DMD4-3 and the optical waveguide 6-1, the focusing lens 7-1-1 is arranged on one side of the optical waveguide 6-1 close to the human eye, and the focusing compensation lens 7-2-1 is arranged on one side of the optical waveguide 6-1 far away from the human eye 8.
The collimating lens 3-5 here is used to condition the light incident on the high refresh rate spatial light modulator 2-2, for example converting a spherical wave into a plane wave before it enters the high refresh rate spatial light modulator 2-2.
The above embodiments are only for illustrating the technical solution of the present invention and not for limiting the same, and although the present invention has been described in detail with reference to the above embodiments, one skilled in the art may make modifications and equivalents to the specific embodiments of the present invention, and any modifications and equivalents not departing from the spirit and scope of the present invention are within the scope of the claims of the present invention.
Claims (27)
1. A three-dimensional scene realization method for augmented reality, comprising:
storing image frames forming a three-dimensional scene in a control system;
the control system performs depth layering on the image frame to obtain a depth layer binary image forming the image frame, and the control system comprises the following steps: acquiring depth information of a scene shown by the image frame; performing depth layering on the image frame based on the depth information by taking focal power as layering parameters to obtain a depth layered image; defining the current depth area of each depth layered image as a value 1, and defining other depth areas as a value 0 to obtain a depth layer binary image forming the image frame;
the control system sends the binary image of the depth layer forming the image frame to the illumination system;
the control system sends the image frames to a display system;
in a set time period, utilizing an illumination system to project a binary image of a depth layer forming an image frame to a display system according to a set time sequence;
the display system transforms the image frames according to the set time period;
the display system sequentially receives the depth layer binary images, processes the corresponding depth layer binary images to obtain corresponding depth layer images, and projects the corresponding depth layer images to the image transmission system;
The image transmission system receives the corresponding depth layer image and transmits the corresponding depth layer image to the front of human eyes so that the corresponding depth layer image is overlapped with a real scene;
the corresponding depth layer image transmitted in front of the human eye is projected to a specific depth in space using a focusing system.
2. The three-dimensional scene realization method for augmented reality according to claim 1, wherein before the illumination system projects the depth layer binary image constituting the image frame to a display system in a set timing, further comprising:
the illumination system receives the depth layer binary image comprising the image frame and stores the depth layer binary image comprising the image frame in a first storage device in the illumination system.
3. The three-dimensional scene realization method for augmented reality according to claim 2, further comprising, before the display system transforms the image frames according to the set time period:
the display system receives the image frames and stores the image frames in a second storage device in the display system.
4. The three-dimensional scene realization method for augmented reality according to claim 1, wherein the display system transforms the image frames according to the set time period, comprising:
The control system controls the image frames to be displayed on the display system in a changing mode according to the set time period based on the sequence of the image frames forming the three-dimensional scene.
5. The method according to claim 4, wherein the illuminating system projects the depth layer binary image constituting the image frame to the display system at a set timing within a set time period, comprising:
in a set time period, the control system controls the illumination system to project a binary image of a depth layer forming an image frame to the display system according to a depth relation according to a set time sequence;
the illumination system projects the depth layer binary image to the display system in a binary pixel mode.
6. The method for implementing a three-dimensional scene for augmented reality according to claim 1, wherein the displaying system sequentially receives the depth layer binary images, processes the corresponding depth layer binary images to obtain corresponding depth layer images, and projects the corresponding depth layer images to an image transmission system, and comprises:
the display system sequentially receives the depth layer binary images, the image pixel positions corresponding to the corresponding depth layer binary images in the image frames are lit, the display system carries out color modulation on the image pixels to obtain corresponding depth layer images, and the corresponding depth layer images are projected to the image transmission system.
7. The three-dimensional scene realization method for augmented reality according to claim 1, wherein the image transmission system is an optical waveguide, the image transmission system receives the corresponding depth layer image, transmits the corresponding depth layer image to the front of the human eye such that the corresponding depth layer image is superimposed with a real scene, comprising:
the coupling-in area of the optical waveguide is coupled in the corresponding depth layer image;
the waveguide matrix of the optical waveguide transmits the coupled corresponding depth layer image;
the coupling-out area of the optical waveguide is coupled out of the corresponding depth layer image;
the real scene passes through the waveguide matrix of the optical waveguide and is overlapped with the corresponding depth layer image.
8. The three-dimensional scene realization method for augmented reality according to claim 5, wherein the focusing system projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space, comprising:
the focusing system projects the corresponding depth layer image to a specific depth in space by adjusting a change in the first optical power to change a divergence angle of light transmitted to the corresponding depth layer image in front of a human eye.
9. The three-dimensional scene realization method for augmented reality according to claim 8, wherein the focusing system changes a ray divergence angle of the corresponding depth layer image transmitted to the front of the human eye by adjusting a change in first optical power, projects the corresponding depth layer image to a specific depth in space, comprising:
The control system presets a form change period of first focal power of the focusing system;
the control system sends a first control signal to the focusing system based on the morphological change period of the first focal power;
and after receiving the corresponding first control signal, the focusing system adjusts the first focal power to a corresponding value and projects the corresponding depth layer image to a specific depth in space.
10. The three-dimensional scene realization method for augmented reality according to claim 9, further comprising:
and the change of the second focal power is regulated by the focusing compensation system to compensate the light modulation effect of the focusing system on the real scene, so as to offset the interference effect of the focusing system on the real scene light.
11. The method of claim 10, wherein the focus compensation system compensates for light modulation effects of the focus system on the real scene by adjusting the change in the second optical power to cancel interference of the focus system on the real scene light, comprising:
the control system presets a form change period of the second focal power of the focusing compensation system;
the control system sends a second control signal to the focusing compensation system based on the form change period of the second focal power;
And after receiving the corresponding second control signal, the focusing compensation system adjusts the second focal power to a corresponding value to counteract the interference of the focusing system on the real scene light.
12. The three-dimensional scene realization method for augmented reality according to claim 10, wherein the second optical power and the first optical power at any moment satisfy the following relationship:

φ2 = φ1 / (d·φ1 − 1)

wherein φ2 represents the second optical power, φ1 represents the first optical power, and d represents the optical path between the image-side principal plane of the focusing compensation system and the object-side principal plane of the focusing system.
13. The three-dimensional scene realization method for augmented reality according to claim 11, wherein the duration of the morphological change period of the first optical power is set corresponding to the duration of the set time period.
14. The three-dimensional scene realization method for augmented reality according to claim 13, wherein the period of the morphological change of the first optical power is a triangular waveform.
15. The method of claim 14, wherein a duration corresponding to one triangle waveform is equal to a duration of the set time period.
16. The method according to claim 14, wherein a time length corresponding to a rising waveform phase or a falling waveform phase of one triangle waveform is equal to a time length of the set time period.
17. The method for three-dimensional scene realization for augmented reality according to claim 15, wherein,
the control system controls the display system to always display the current image frame in the time length corresponding to one triangular waveform;
during the rising waveform duration phase of the triangular waveform,
the control system controls the illumination system to project a part of depth layer binary images forming the current image frame to the display system according to the depth relation of the depth layers from small to large;
the display system sequentially receives the partial depth layer binary images, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system, and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
the focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a near-to-far relationship;
the focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light;
during the falling waveform duration phase of the triangular waveform,
The control system controls the illumination system to project the binary image of the residual depth layer forming the current image frame to the display system according to the depth relation of the depth layer from small to large according to the set time sequence;
the display system sequentially receives the residual depth layer binary images, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system, and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
the focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a far-near relationship;
the focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light.
18. The method for three-dimensional scene realization for augmented reality according to claim 15, wherein,
the control system controls the display system to always display the current image frame in the time length corresponding to one triangular waveform;
during the rising waveform duration phase of the triangular waveform,
the control system controls the illumination system to sequentially project the binary images of the depth layers forming the odd sequence of the current image frame to the display system according to the set time sequence;
The display system sequentially receives the depth layer binary images of the odd sequence, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system, and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
the focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a near-to-far relationship;
the focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light;
during the falling waveform duration phase of the triangular waveform,
the control system controls the illumination system to sequentially project the even sequence of depth layer binary images forming the current image frame to the display system according to the set time sequence;
the display system sequentially receives the even-numbered sequence depth layer binary images, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system, and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
the focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a far-near relationship;
The focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light.
19. The method for three-dimensional scene realization for augmented reality according to claim 16, wherein,
during the rising waveform duration phase of the triangular waveform,
the control system controls the display system to display the current image frame;
the control system controls the illumination system to project all binary images of depth layers forming the current image frame to the display system according to the depth relation of the depth layers from small to large according to the set time sequence;
the display system sequentially receives all depth layer binary images of the current image frame, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
the focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a near-to-far relationship;
the focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light;
During the falling waveform duration phase of the triangular waveform,
the control system controls the illumination system not to perform projection operation to the display system or controls the display system to display a full black image.
20. The method for three-dimensional scene realization for augmented reality according to claim 16, wherein,
during the rising waveform duration phase of the triangular waveform,
the control system controls the display system to display the current frame;
the control system controls the illumination system to project all binary images of depth layers forming the current image frame to the display system according to the depth relation of the depth layers from small to large according to the set time sequence;
the display system sequentially receives all depth layer binary images of the current frame, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
the focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a near-to-far relationship;
the focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light;
During the falling waveform duration phase of the triangular waveform,
the control system controls the display system to display the next image frame;
the control system controls the illumination system to project all the binary images of the depth layers forming the next image frame to the display system according to the depth relation of the depth layers from small to large according to the set time sequence;
the display system sequentially receives all depth layer binary images of the next image frame, processes the corresponding depth layer binary images to obtain corresponding depth layer images, projects the corresponding depth layer images to the image transmission system and transmits the corresponding depth layer images to the front of human eyes through the image transmission system;
the focusing system receives a corresponding first control signal sent by the control system, and projects the corresponding depth layer image transmitted to the front of the human eye to a specific depth in space according to a far-near relationship;
the focusing compensation system receives a corresponding second control signal sent by the control system, and counteracts the interference effect of the focusing system on the real scene light.
21. The method according to claim 1, wherein before the illumination system projects the depth layer binary image constituting the image frame to the display system in a set timing, the method further comprises:
The illumination system projects the binary images of the depth layers forming the image frames to the relay system according to a set time sequence, and the binary images are modulated by the relay system and then sent to the display system.
22. A three-dimensional scene realization system for augmented reality is characterized by comprising a control system, a display system, an illumination system, an image transmission system and a focusing system;
the focusing system is arranged on one side of the image transmission system close to human eyes;
the control system is used for:
storing the image frames constituting a three-dimensional scene; performing depth layering on the image frames to obtain depth layer binary images forming the image frames; transmitting a depth layer binary image constituting the image frame to an illumination system; transmitting the image frame to a display system;
the method for obtaining the depth layer binary image of the image frame comprises the following steps: acquiring depth information of a scene shown by the image frame; performing depth layering on the image frame based on the depth information by taking focal power as layering parameters to obtain a depth layered image; defining the current depth area of each depth layered image as a value 1, and defining other depth areas as a value 0 to obtain a depth layer binary image forming the image frame;
The illumination system is used for projecting the binary images of the depth layers forming the image frames to the display system according to a set time sequence in a set time period;
the display system is used for converting the image frames according to the set time period; sequentially receiving the depth layer binary images, processing the corresponding depth layer binary images to obtain corresponding depth layer images, and projecting the corresponding depth layer images to an image transmission system;
the image transmission system is used for receiving the corresponding depth layer image and transmitting the corresponding depth layer image to the front of the human eyes so that the corresponding depth layer image is overlapped with the real scene;
the focusing system is used for projecting the corresponding depth layer image transmitted to the front of human eyes to a specific depth in space.
23. The three-dimensional scene realization system for augmented reality according to claim 22, further comprising a first storage device disposed within the illumination system;
the illumination system is further configured to receive a depth layer binary image that constitutes the image frame, and store the depth layer binary image that constitutes the image frame in the first storage device.
24. The three-dimensional scene realization system for augmented reality according to claim 23, further comprising a second storage device disposed within the display system;
the display system is further configured to receive the image frame and store the image frame in the second storage device.
25. The three-dimensional scene realization system for augmented reality according to claim 22, wherein the image transmission system employs an optical waveguide comprising a waveguide substrate having an in-coupling region and an out-coupling region disposed thereon;
a coupling-in area for coupling in the corresponding depth layer image;
the waveguide substrate is used for transmitting the coupled corresponding depth layer image and superposing the corresponding depth layer image and the real scene;
and the coupling-out area is used for coupling out the corresponding depth layer image.
26. The three-dimensional scene realization system for augmented reality according to claim 22, further comprising a focus compensation system disposed on a side of the image transmission system remote from the human eye;
the focusing compensation system is used for compensating the light modulation effect of the focusing system on the real scene and counteracting the interference effect of the focusing system on the light of the real scene.
27. The three-dimensional scene realization system for augmented reality according to claim 22, further comprising a relay system disposed between the illumination system and the display system;
before the illumination system projects the depth layer binary images constituting the image frames to the display system in a set timing,
the relay system is used for receiving the depth layer binary images which are transmitted by the illumination system according to the set time sequence and form the image frames, modulating the received depth layer binary images, and transmitting the modulated depth layer binary images to the display system.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN2023102918731 | 2023-03-23 | | |
| CN202310291873 | 2023-03-23 | | |
Publications (2)
| Publication Number | Publication Date |
| --- | --- |
| CN116389705A (en) | 2023-07-04 |
| CN116389705B (en) | 2023-10-13 |
Family
ID=86961365
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202310377980.6A (Active, granted as CN116389705B) | Three-dimensional scene realization method and system for augmented reality | 2023-03-23 | 2023-04-10 |
Country Status (1)
| Country | Link |
| --- | --- |
| CN | CN116389705B (en) |
Patent Citations (5)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| CN106204587A * | 2016-05-27 | 2016-12-07 | 孔德兴 | Multiple organ dividing method based on degree of depth convolutional neural networks and region-competitive model |
| KR102236917B1 * | 2019-12-24 | 2021-04-06 | 주식회사 파코웨어 | Three dimensional modeling method using three dimensional depth map and region segmentation based on a single camera for assembly type education apparatus with polyomino |
| CN112396703A * | 2020-11-18 | 2021-02-23 | 北京工商大学 | Single-image three-dimensional point cloud model reconstruction method |
| WO2022152708A1 * | 2021-01-14 | 2022-07-21 | Interdigital Ce Patent Holdings, Sas | A method and apparatus for generating adaptive multiplane images |
| CN112802186A * | 2021-01-27 | 2021-05-14 | 清华大学 | Dynamic scene real-time three-dimensional reconstruction method based on binarization characteristic coding matching |
Non-Patent Citations (1)
| Title |
| --- |
| 葛启杰 (Ge Qijie). Research on Three-Dimensional Scene Reconstruction Methods Based on Structured Light. Master's thesis, 2019, full text. * |
Also Published As
| Publication number | Publication date |
| --- | --- |
| CN116389705A (en) | 2023-07-04 |
Legal Events
| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |