CN102306401A - Left/right-eye three-dimensional picture drawing method for three-dimensional (3D) virtual scene containing fuzzy reflection effect - Google Patents
Abstract
The invention discloses a method for rendering left-eye/right-eye stereoscopic images of a three-dimensional (3D) virtual scene containing glossy ("fuzzy") reflection effects, and belongs to the technical field of photorealistic rendering of 3D virtual scenes. Traditional stereoscopic rendering methods for 3D virtual scenes draw the image of the left-eye virtual camera and the image of the right-eye virtual camera independently of each other. In the present method, the rendering programs for the two camera images are executed simultaneously on a dual-core central processing unit (CPU): the left-eye image is rendered starting from the first row of pixels, while the right-eye image is rendered starting from the last row of pixels, which ensures that incident radiance computations can be reused between the two images. Using a gradient-based interpolation method, cached incident radiance results are reused during rendering. Compared with the traditional method, the invention increases the rendering speed of the 3D virtual scene by 20 to 35 percent.
Description
Technical field
The invention belongs to the technical field of photorealistic rendering of 3D virtual scenes, and relates to a method for rendering left-eye/right-eye stereoscopic images of a 3D virtual scene containing glossy reflection effects.
Background art
Three-dimensional animated film is an emerging form of computer art that has grown rapidly in recent years and has been widely applied in many industries. To produce a stronger visual impact, many current 3D animated films provide stereoscopic visual effects. The basic principle of the stereoscopic effect is that, when watching the film, the viewer's left and right eyes each see, without mutual interference, the image captured by the corresponding left-eye or right-eye virtual camera; the human brain then fuses the two images, producing the impression that the picture recedes into or pops out of the screen.
To improve the realism of a 3D animated film, global illumination effects are often required in the rendered images. The key to achieving global illumination is to correctly simulate how light is reflected by the various object surfaces in the 3D virtual scene. In 3D image rendering, the common reflection types are specular reflection, diffuse reflection, and glossy reflection; specular and diffuse reflection can in fact be regarded as special cases of glossy reflection. The reflectance characteristics of an object surface are usually modeled by a bidirectional reflectance distribution function (BRDF). In 2004, Pascal Gautron et al. proposed representing hemispherical functions with hemispherical harmonics basis functions (see the paper "A Novel Hemispherical Basis for Accurate and Efficient Rendering" in the proceedings of the 2004 Eurographics Symposium on Rendering, by Pascal Gautron, Jaroslav Krivanek, Sumanta Pattanaik, and Kadi Bouatouch). Because physical quantities such as the BRDFs of surfaces in a 3D virtual scene and the incident radiance are in fact hemispherical functions, they can be represented with hemispherical harmonics basis functions.
Traditional stereoscopic rendering methods for 3D virtual scenes draw the images of the left-eye and right-eye virtual cameras independently. Glossy reflection is a common reflection type in photorealistic 3D virtual scenes. However, the outgoing radiance of glossy reflection depends strongly on the viewing angle, and the same point of the 3D virtual scene is usually seen from different angles by the left-eye and right-eye virtual cameras. Consequently, the illumination values that the same surface point contributes to the pixels of the two camera images are usually not identical, which makes it difficult to directly reuse the glossy-reflection results computed for the left-eye image when rendering the right-eye image. The present invention represents the BRDFs of scene surfaces and the incident radiance with hemispherical harmonics basis functions and reuses existing computation results through an incident radiance cache, thereby increasing the speed of rendering the left/right-eye stereoscopic images of a 3D virtual scene containing glossy reflection effects.
Summary of the invention
The object of the present invention is to provide a method for rendering left-eye/right-eye stereoscopic images of a 3D virtual scene containing glossy reflection effects. The equipment required by the method comprises a computer system with a dual-core CPU and shared memory. As shown in Figure 1, when the left-eye virtual camera (101) and the right-eye virtual camera (102) can both photograph points p_1, p_2, and p_3 on the surface of an object (105) in the 3D virtual scene, the illumination values of p_1, p_2, and p_3 are stored in pixels on the image plane (103) of the left-eye virtual camera (101) and on the image plane (104) of the right-eye virtual camera (102), respectively. First, the bidirectional reflectance distribution function (BRDF) of each diffuse and glossy reflective surface in the 3D virtual scene is expanded into a sum of low-order hemispherical harmonics basis functions (HSHBF), and the coefficient of each HSHBF order is stored in shared memory and associated with the BRDF of that surface. Then the image of the left-eye virtual camera (101) and the image of the right-eye virtual camera (102) are rendered simultaneously on the dual-core CPU; the incident radiance at diffuse and glossy surface points is likewise expanded into a sum of low-order HSHBF and stored in an incident radiance cache. Through gradient-based interpolation of the cached incident radiance, existing incident radiance results are reused, thereby increasing the rendering speed of the left/right-eye stereoscopic images of the 3D virtual scene.
The method provides an incident radiance record structure, which comprises member variables such as a position component, a local coordinate frame component, an HSHBF expansion coefficient component, and an HSHBF expansion coefficient translational gradient component.
The method stores incident radiance records in an incident radiance cache located in shared memory. To make it easy both to retrieve records from the cache by spatial position and to add new records to it, the cache uses a three-dimensional uniform grid data structure: given a 3D position, all incident radiance records in the corresponding grid cell can be retrieved directly, and new records can be added to that cell.
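The uniform-grid cache described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the class and field names (`RadianceCache`, `position`) are assumptions.

```python
import math
from collections import defaultdict, namedtuple

class RadianceCache:
    """Minimal sketch of the uniform-grid incident radiance cache:
    records are bucketed by 3D grid cell, so lookup by position
    touches only one cell."""

    def __init__(self, cell_size):
        self.cell_size = cell_size
        self.cells = defaultdict(list)  # cell index -> list of records

    def _cell(self, p):
        # Map a 3D position to its integer grid-cell index.
        return tuple(math.floor(c / self.cell_size) for c in p)

    def add(self, record):
        # A record is any object with a .position attribute (3 floats).
        self.cells[self._cell(record.position)].append(record)

    def lookup(self, p):
        # Return all records stored in the cell containing position p.
        return self.cells.get(self._cell(p), [])

# Illustrative usage with a hypothetical record layout.
Record = namedtuple("Record", "position coeffs")
cache = RadianceCache(cell_size=1.0)
cache.add(Record(position=(0.2, 0.3, 0.9), coeffs=None))
cache.add(Record(position=(0.8, 0.1, 0.5), coeffs=None))
cache.add(Record(position=(2.5, 0.0, 0.0), coeffs=None))
```

A production cache would typically also search neighboring cells within the record's validity radius; the single-cell lookup here only shows the bucketing idea.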
The first part of the method computes the low-order HSHBF expansion coefficients of the BRDFs of all diffuse and glossy reflective surfaces in the 3D virtual scene. The concrete steps are as follows:
Step S101: set the minimum HSHBF order n_l, the maximum HSHBF order n_h, and the error threshold E_t of the HSHBF expansion.
Step S102: regard diffuse reflection as a special case of glossy reflection. For the BRDF (A101) of each diffuse and glossy reflective surface in the 3D virtual scene, mark BRDF (A101) as expressible as a sum of low-order HSHBF, set n_v = n_l, and do the following computation:
1. Sample the solid angles of the upper hemisphere according to a cosine distribution. At each angular sample (θ_o, φ_o), where θ_o is the polar-angle component and φ_o is the azimuthal component, compute the HSHBF expansion coefficients c_{l,m}(θ_o, φ_o), for l = 0, 1, ..., n_v and m = -l, ..., l, of BRDF (A101) regarded as a function of the incident direction with the outgoing direction fixed at (θ_o, φ_o); here H_{l,m} denotes a hemispherical harmonics basis function, and c_{l,m}(θ_o, φ_o) is the HSHBF expansion coefficient associated with BRDF (A101) at the angular sample (θ_o, φ_o);
2. Compute the approximation error E of the truncated expansion of order n_v. If E > E_t and n_v < n_h, set n_v = n_v + 1 and go to 1.; if E > E_t and n_v = n_h, mark that BRDF (A101) cannot be expressed as a sum of low-order HSHBF; if E <= E_t, store all HSHBF expansion coefficients c_{l,m}(θ_o, φ_o) in shared memory and associate them with BRDF (A101) and the angular sample (θ_o, φ_o).
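The adaptive-order loop of step S102 can be sketched as follows. As a stand-in for the hemispherical harmonics basis (whose evaluation is outside the scope of this sketch), the example projects a 1-D function onto Legendre polynomials; what it illustrates is the control flow of raising the order n_v until the error threshold is met or the maximum order n_h is reached. Function names are assumptions.

```python
import numpy as np
from numpy.polynomial import legendre

def expand_to_tolerance(f, n_lo, n_hi, err_thresh, n_samples=512):
    """Raise the expansion order from n_lo until the RMS reconstruction
    error drops below err_thresh, as in step S102. Returns
    (coeffs, order) on success, or (None, None) when even order n_hi
    fails, mirroring the 'cannot be expressed as low-order HSHBF'
    marking."""
    x = np.linspace(-1.0, 1.0, n_samples)
    y = f(x)
    n_v = n_lo
    while True:
        coeffs = legendre.legfit(x, y, deg=n_v)      # expansion coefficients
        err = np.sqrt(np.mean((legendre.legval(x, coeffs) - y) ** 2))
        if err <= err_thresh:
            return coeffs, n_v                       # store coefficients
        if n_v >= n_hi:
            return None, None                        # mark as not expressible
        n_v += 1                                     # raise the order, retry
```

A linear function is captured exactly at order 1, while a function with a derivative discontinuity (such as the absolute value) exhausts a small order budget, triggering the failure branch.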
The second part of the method renders the images of the left-eye and right-eye virtual cameras in parallel on a computer system with a dual-core CPU and shared memory. By caching incident radiance values that have already been computed and interpolating them with a gradient-based method, incident radiance results are reused, increasing the rendering speed of both camera images. The concrete steps are as follows:
Step S201: clear the incident radiance cache in shared memory;
Step S202: run the rendering program for the left-eye camera image on the first CPU core and, simultaneously, the rendering program for the right-eye camera image on the second CPU core; the left-eye rendering program executes step S203, and the right-eye rendering program executes step S204;
Step S203: set I = 1, J = 1, and compute as follows:
1. Emit a primary ray (B01) from the optical center of the left-eye virtual camera through the pixel at row I, column J of the left-eye image plane, and test whether ray (B01) intersects a surface of the 3D virtual scene. If it does not, set the color of the pixel corresponding to ray (B01) to the background illumination value and go to 7.; otherwise compute the nearest intersection point p_i, sample the light sources with the Monte Carlo method, and compute the direct illumination value at p_i;
2. If the surface at intersection point p_i is specular, recursively trace the specularly reflected secondary ray and compute the indirect illumination value of the specular reflection; if the surface at p_i is diffuse, go to 3.; if the surface at p_i is glossy, query in shared memory whether the BRDF of the surface at p_i can be expressed as a sum of low-order HSHBF: if it can, go to 3., otherwise use the Monte Carlo method to importance-sample the hemisphere about the positive N_i axis of the local coordinate frame (U_i, V_i, N_i) at p_i (where N_i is aligned with the surface normal at p_i), directly compute the indirect illumination value of the glossy reflection, and go to 6.;
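The hemisphere sampling invoked in the steps above can be sketched with the standard cosine-weighted construction (the patent does not spell out its sampler, so this particular mapping is an assumption): two uniform random numbers are mapped to a unit direction about the local +N axis with probability density cos(θ)/π.

```python
import math, random

def cosine_sample_hemisphere(u1, u2):
    """Map two uniform numbers in [0,1) to a unit direction in the
    local frame (U, V, N), expressed as (x, y, z) with z along +N.
    Directions are distributed with pdf cos(theta)/pi."""
    r = math.sqrt(u1)                  # radius in the unit disk
    phi = 2.0 * math.pi * u2           # azimuth
    x = r * math.cos(phi)
    y = r * math.sin(phi)
    z = math.sqrt(max(0.0, 1.0 - u1))  # cos(theta), always >= 0
    return (x, y, z)

# Example: u1 = 0.25, u2 = 0.0 gives a direction tilted toward +U.
d = cosine_sample_hemisphere(0.25, 0.0)
```

Sampling proportionally to cos(θ) concentrates samples where the geometric term of the reflection integral is largest, which is why it is the usual choice for diffuse and mildly glossy lobes.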
3. Lock the incident radiance cache for memory access; search the cache for incident radiance records (B02) satisfying the following condition:
Condition A: the distance from p_i to p, normalized by the harmonic mean distance R_i, together with the deviation between the normals n and n_i, is small enough (relative to the precision threshold a) for the record to be reused at p_i; here p is the position component member variable of the incident radiance record (B02), n is the normal vector at the surface point represented by that position component, n_i is the unit normal vector at intersection point p_i, R_i is the harmonic mean distance from p_i to all visible objects, and a is a precision threshold given in advance. Then unlock the cache;
4. Put all found incident radiance records (B02) satisfying condition A into a set S. If S is non-empty, go to 5.; otherwise use the Monte Carlo method to sample solid angles over the hemisphere about the positive N_i axis of the local coordinate frame (U_i, V_i, N_i) at p_i, compute the incident radiance value in each sampled direction (K denotes the number of angular samples), and from these values compute the HSHBF expansion coefficients of the incident radiance and their translational gradients, for l = 0, 1, ..., n_v and m = -l, ..., l. Create a new incident radiance record whose position component is assigned p_i, whose local coordinate frame component is assigned (U_i, V_i, N_i), whose HSHBF expansion coefficient component is assigned the computed coefficients, and whose HSHBF expansion coefficient translational gradient component is assigned the computed gradients. Lock the incident radiance cache for memory access, add this record to the grid cell of the cache corresponding to position p_i, also add it to the set S, then unlock the cache;
5. Using the local coordinate frame component of each element of S, rotate each element's local frame to align it with the local frame at p_i. Using the gradient-based interpolation method, interpolate the HSHBF expansion coefficients of the incident radiance at p_i from the incident radiance records in S. Compute the angle in the hemisphere about the positive N_i axis of the local frame (U_i, V_i, N_i) that corresponds to the primary ray (B01), and take the reflected radiance in that direction, obtained by combining the interpolated incident radiance coefficients with the HSHBF expansion coefficients of the BRDF of the surface at p_i, as the indirect illumination value of the glossy reflection at p_i;
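The gradient-based interpolation of step 5 can be sketched as a weighted blend of first-order Taylor extrapolations: each cached record carries its coefficient vector to the query point using its stored translational gradient. The record layout and the externally supplied weights are assumptions of this sketch.

```python
import numpy as np

def interpolate_coeffs(p_i, records, weights):
    """Blend cached HSHBF coefficient vectors at query point p_i.
    Each record is (position, coeffs, grad), where grad holds one
    3-vector of translational gradient per coefficient; weights are
    the per-record interpolation weights."""
    p_i = np.asarray(p_i, dtype=float)
    total = np.zeros_like(np.asarray(records[0][1], dtype=float))
    wsum = 0.0
    for (p, coeffs, grad), w in zip(records, weights):
        dp = p_i - np.asarray(p, dtype=float)
        # First-order Taylor extrapolation: c(p_i) ~ c(p) + grad . (p_i - p)
        total += w * (np.asarray(coeffs, float) + np.asarray(grad, float) @ dp)
        wsum += w
    return total / wsum
```

Including the gradient term is what lets sparse cache records reproduce smooth illumination variation between record positions, rather than flat plateaus around each record.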
6. Take the sum of the direct illumination value at p_i and the indirect illumination value of the reflection as the final illumination value at p_i;
7. Set J = J + 1; if J > N_pix, then set J = 1 and I = I + 1, where N_pix denotes the number of pixel columns of the left-eye camera image;
8. If I <= M_pix, where M_pix denotes the number of pixel rows of the left-eye camera image, go to 1.; otherwise go to step S205;
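The opposing traversal orders of steps S203 and S204 can be sketched as generators (pixel indices are 1-based, as in the text):

```python
def left_eye_order(m_rows, n_cols):
    """Forward scan of step S203: row 1 to m_rows, column 1 to n_cols."""
    for i in range(1, m_rows + 1):
        for j in range(1, n_cols + 1):
            yield (i, j)

def right_eye_order(m_rows, n_cols):
    """Backward scan of step S204: starts at (m_rows, n_cols) and walks
    columns right-to-left, then rows bottom-to-top, mirroring the
    J' = J' - 1 and I' = I' - 1 updates."""
    for i in range(m_rows, 0, -1):
        for j in range(n_cols, 0, -1):
            yield (i, j)
```

Because the two scans start at opposite corners of the image and advance toward each other, each eye's rendering program soon reaches regions where the other eye has already populated the incident radiance cache, which is what makes the cached records reusable across the two images.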
Step S204: set I' = M_pix, J' = N_pix, where M_pix denotes the number of pixel rows and N_pix the number of pixel columns of the right-eye camera image, and compute as follows:
1. Emit a primary ray (B03) from the optical center of the right-eye virtual camera through the pixel at row I', column J' of the right-eye image plane, and test whether ray (B03) intersects a surface of the 3D virtual scene. If it does not, set the color of the pixel corresponding to ray (B03) to the background illumination value and go to 7.; otherwise compute the nearest intersection point p'_i, sample the light sources with the Monte Carlo method, and compute the direct illumination value at p'_i;
2. If the surface at intersection point p'_i is specular, recursively trace the specularly reflected secondary ray and compute the indirect illumination value of the specular reflection; if the surface at p'_i is diffuse, go to 3.; if the surface at p'_i is glossy, query in shared memory whether the BRDF of the surface at p'_i can be expressed as a sum of low-order HSHBF: if it can, go to 3., otherwise use the Monte Carlo method to importance-sample the hemisphere about the positive N'_i axis of the local coordinate frame (U'_i, V'_i, N'_i) at p'_i (where N'_i is aligned with the surface normal at p'_i), directly compute the indirect illumination value of the glossy reflection, and go to 6.;
3. Lock the incident radiance cache for memory access; search the cache for incident radiance records (B04) satisfying the following condition:
Condition A': the same acceptance test as condition A, where p' is the position component member variable of the incident radiance record (B04), n' is the normal vector at the surface point represented by that position component, n'_i is the unit normal vector at intersection point p'_i, R_i is the harmonic mean distance from p'_i to all visible objects, and a is the precision threshold given in advance. Then unlock the cache;
4. Put all found incident radiance records (B04) satisfying condition A' into a set S'. If S' is non-empty, go to 5.; otherwise use the Monte Carlo method to sample solid angles over the hemisphere about the positive N'_i axis of the local coordinate frame (U'_i, V'_i, N'_i) at p'_i, compute the incident radiance value in each sampled direction (K denotes the number of angular samples), and from these values compute the HSHBF expansion coefficients of the incident radiance and their translational gradients, for l = 0, 1, ..., n_v and m = -l, ..., l. Create a new incident radiance record whose position component is assigned p'_i, whose local coordinate frame component is assigned (U'_i, V'_i, N'_i), whose HSHBF expansion coefficient component is assigned the computed coefficients, and whose HSHBF expansion coefficient translational gradient component is assigned the computed gradients. Lock the incident radiance cache for memory access, add this record to the grid cell of the cache corresponding to position p'_i, also add it to the set S', then unlock the cache;
5. Using the local coordinate frame component of each element of S', rotate each element's local frame to align it with the local frame at p'_i. Using the gradient-based interpolation method, interpolate the HSHBF expansion coefficients of the incident radiance at p'_i from the incident radiance records in S'. Compute the angle in the hemisphere about the positive N'_i axis of the local frame (U'_i, V'_i, N'_i) that corresponds to the primary ray (B03), and take the reflected radiance in that direction, obtained by combining the interpolated incident radiance coefficients with the HSHBF expansion coefficients of the BRDF of the surface at p'_i, as the indirect illumination value of the glossy reflection at p'_i;
6. Take the sum of the direct illumination value at p'_i and the indirect illumination value of the reflection as the final illumination value at p'_i;
7. Set J' = J' - 1; if J' < 1, then set J' = N_pix and I' = I' - 1;
8. If I' >= 1, go to 1.; otherwise go to step S205;
Step S205: the rendering of the left-eye and right-eye camera images is finished.
Beneficial effects
The invention provides a method for rendering left-eye/right-eye stereoscopic images of a 3D virtual scene containing glossy reflection effects. Because a dual-core CPU renders the two camera images simultaneously, and because the left-eye image is rendered starting from the first row of pixels while the right-eye image is rendered starting from the last row, the reusability of incident radiance results between the two rendering processes is further ensured; through gradient-based interpolation, incident radiance results are reused during rendering, which reduces the total time needed to render the left/right-eye stereoscopic images of the 3D virtual scene. Rendering stereoscopic images of a 3D virtual scene with the method of the invention is 20% to 35% faster than with the traditional method.
Description of drawings
Fig. 1 is a schematic diagram of the imaging of points on an object surface in the 3D virtual scene onto the image planes of the left-eye and right-eye virtual cameras.
Embodiment
To make the features and advantages of the present invention clearer, the invention is further described below with reference to a specific embodiment.
In the present embodiment, an Intel(R) Xeon(TM) dual-core CPU is selected, and exclusive access of the two CPU cores to the shared memory is achieved by locking. The incident radiance cache is located in shared memory; the rendering programs of both the left-eye and right-eye camera images store their computed incident radiance results in the incident radiance cache, and look up the incident radiance of nearby points in the cache for interpolation. Therefore the incident radiance results computed by the left-eye rendering program can be used directly by the right-eye rendering program; likewise, the results computed by the right-eye rendering program can be used directly by the left-eye rendering program.
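The locked shared-cache access described above can be sketched with two threads standing in for the two per-eye rendering programs. The class and function names are illustrative, not from the patent; the point is that every read and write of the shared record store happens under one lock, so each eye can safely reuse what the other has computed.

```python
import threading

class SharedCache:
    """Sketch of the shared-memory cache with lock-guarded access."""

    def __init__(self):
        self._lock = threading.Lock()
        self._records = []

    def add(self, record):
        with self._lock:          # "lock for memory access"
            self._records.append(record)

    def snapshot(self):
        with self._lock:          # lock released on leaving the block
            return list(self._records)

def render_eye(cache, eye, n):
    # Stand-in for one eye's rendering loop: it only records which eye
    # produced each entry.
    for k in range(n):
        cache.add((eye, k))

cache = SharedCache()
threads = [threading.Thread(target=render_eye, args=(cache, e, 100))
           for e in ("left", "right")]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

A finer-grained design could lock per grid cell instead of the whole cache, trading lock overhead against contention between the two cores.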
The object of the present invention is to provide a kind of right and left eyes stereoscopic picture plane method for drafting that comprises the 3D virtual scene of fuzzy reflecting effect.The equipment that the inventive method needs comprises: the computer system that has double-core CPU and shared drive.As shown in Figure 1, can photograph lip-deep some p of the object (105) in the 3D virtual scene simultaneously when left eye virtual camera (101) and right eye virtual camera (102)
1, the some p
2With a p
3The time, some p
1, the some p
2With a p
3Illumination value will be kept at respectively the picture plane (103) of left eye virtual camera (101) and looking like in the pixel on the plane (104) of right eye virtual camera (102).At first with bidirectional reflectance distribution function (the Bi-directional Reflectance Distribution Function of diffuse reflection in the 3D virtual scene and fuzzy reflecting surface; Be abbreviated as BRDF) be launched into the harmonious basis function of low order hemisphere (Hemispherical Harmonics Basis Function; Be abbreviated as HSHBF) add up and form, the coefficient of each rank HSHBF be kept in the shared drive and with this surperficial BRDF be associated.On double-core CPU, draw the image frame of left eye virtual camera (101) and the image frame of right eye virtual camera (102) then simultaneously; With the incident radiation brightness of diffuse reflection and fuzzy reflecting surface point be launched into adding up of low order HSHBF and form; And be kept in the incident radiation brightness buffer memory; Incident radiation brightness interpolating through based on gradient calculates; Realize reusing of incident radiation brightness calculation result, thereby improve the render speed of the right and left eyes stereoscopic picture plane of 3D virtual scene.
The inventive method provides a kind of incident radiation brightness record association structure, and it comprises member variables such as location components, local coordinate system component, HSHBF expansion coefficient component, HSHBF expansion coefficient translation gradient component.
The inventive method is kept at the incident radiation brightness buffer memory that is arranged in shared drive with incident radiation brightness record item; For the ease of retrieving the incident radiation brightness record item in the incident radiation brightness buffer memory according to the locus and in incident radiation brightness buffer memory, adding new incident radiation brightness record item, incident radiation brightness buffer memory uses three dimensions uniform grid data structure; According to given three-dimensional space position, can directly retrieve all the incident radiation brightness record items in the corresponding grid, and in grid, add new incident radiation brightness record item.
The low order HSHBF expansion coefficient of all diffuse reflections in the calculating 3D of the first virtual scene of the inventive method and the BRDF of fuzzy reflecting surface, concrete steps are following:
Step S101: minimum HSHBF exponent number n is set
l, the highest HSHBF exponent number n
hError threshold E with the HSHBF expansion
t
Step S102: regard diffuse reflection the special case of fuzzy reflection as, for the BRDF (A101) of each diffuse reflection in the 3D virtual scene and fuzzy reflecting surface, mark BRDF (A101) can be expressed as adding up of low order HSHBF and form, make n
v=n
l, do following calculating:
1. align the hemisphere space by cosine distribution and carry out the space angle sampling, in each angle sampling
(θ is wherein calculated at the place
oBe the polar angle component,
Be azimuthal component):
L=0,1 ..., n
v, m=-l ..., l, wherein
For BRDF (A101) in emergence angle does
Incident angle does
The time value,
Be a harmonious basis function of hemisphere,
For BRDF (A101) samples in angle
Go up with
The HSHBF expansion coefficient that is associated;
2. calculate
If
And n
v<n
h, n then
v=n
v+ 1, change 1.; If
And n
v=n
h, then mark BRDF (A101) can not be expressed as adding up of low order HSHBF and form; If
Then in shared drive, preserve all HSHBF expansion coefficients
And with itself and BRDF (A101) and angle sampling
Be associated.
The second portion of the inventive method is drawn the image frame of right and left eyes virtual camera concurrently on the computer system that has double-core CPU and shared drive; Through the incident radiation brightness value that has calculated being carried out buffer memory and utilizing interpolation method based on gradient; Realization incident radiation brightness calculation result's is multiplexing; With the render speed of the image frame that improves the right and left eyes virtual camera, concrete steps are following:
Step S201: empty the incident radiation brightness buffer memory in the shared drive;
Step S202: the drawing program of the image frame of operation left eye virtual camera on first CPU calculating inner core, the drawing program of the image frame of operation right eye virtual camera on second CPU calculating inner core simultaneously; The drawing program execution in step S203 of the image frame of left eye virtual camera, the drawing program execution in step S204 of the image frame of right eye virtual camera;
Step S203: make I=1, J=1, carry out as follows and calculate:
1. whether capable, the J row pixel emission chief ray (B01) of the I on the picture plane of left eye virtual camera from left eye virtual camera photocentre position is tested chief ray (B01) and is intersected with the surface of 3D virtual scene; If non-intersect, then the color of pixel that chief ray (B01) is corresponding is set to the background illumination value, changes 7., otherwise calculates nearest position of intersecting point p
i, utilize monte carlo method that light source is sampled, calculate intersection point p
iThe direct illumination value at place;
If 2. intersection point p
iThe reflection type on surface, place is a direct reflection, and then recurrence is followed the tracks of the direct reflection secondary light ray, calculates the indirect illumination value of direct reflection; If intersection point p
iThe reflection type on surface, place is diffuse reflection, then changes 3.; If intersection point p
iThe reflection type on surface, place is fuzzy reflection, then inquiry intersection point p in shared drive
iThe BRDF on place surface whether can be expressed as adding up of low order HSHBF and form, if can, then change 3., otherwise use monte carlo method antinode p
iLocal coordinate system (U
i, V
i, N
i) positive N
iImportance sampling (N is carried out in the hemisphere space
iWith intersection point p
iNormal direction in the same way), directly calculate the indirect illumination value of fuzzy reflection, change 6.;
3. incident radiation brightness buffer memory is carried out memory access and add latching operation; From incident radiation brightness buffer memory, search the incident radiation brightness record (B02) that satisfies following condition:
Condition A:
Wherein p is the location components member variable in the incident radiation brightness record (B02), and n is the normal line vector at the represented surface point place of the location components member variable of incident radiation brightness record (B02), n
iBe intersection point p
iUnit normal vector, R
iBe intersection point p
iTo the cllipsoidal harmonics mean distance of all viewable objects, a is given in advance precision threshold; Incident radiation brightness buffer memory is carried out the memory access unlocking operation;
(4) Put all found incident radiance records (B02) that satisfy Condition A into a set S. If S is non-empty, go to (5); otherwise use the Monte Carlo method to draw solid-angle samples over the positive-N_i hemisphere of the local frame (U_i, V_i, N_i) at p_i, compute the incident radiance value L(omega_k) in each sampled direction omega_k (k = 1, ..., N, where N is the number of angle samples), and from these values compute the HSHBF expansion coefficients C_l^m of the incident radiance and their translation gradients grad C_l^m, where l = 0, 1, ..., n_v and m = -l, ..., l. Create a new incident radiance record whose position component is assigned p_i, whose local-frame component is assigned (U_i, V_i, N_i), whose HSHBF expansion-coefficient component is assigned the C_l^m, and whose coefficient translation-gradient component is assigned the grad C_l^m. Lock the incident radiance cache for memory access, add the new record to the grid cell of the cache corresponding to position p_i and also add it to the set S, then unlock the incident radiance cache.
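The hemisphere sampling and coefficient estimation in the substep above can be sketched as a Monte Carlo projection of the sampled radiance onto hemisphere basis functions. The basis below is a simplified stand-in (only a constant term, normalized over the hemisphere), not the patent's full HSHBF set, and the function names are illustrative:

```python
import math
import random

def project_radiance(basis_fns, radiance_fn, n_samples=1000, rng=random):
    """Monte Carlo projection of incident radiance onto hemisphere basis
    functions -- a simplified stand-in for the HSHBF coefficients C_l^m.

    Directions are sampled uniformly over the upper hemisphere; the
    estimator for each coefficient is (2*pi/N) * sum L(w_k) * H_j(w_k).
    """
    coeffs = [0.0] * len(basis_fns)
    for _ in range(n_samples):
        u, v = rng.random(), rng.random()
        theta = math.acos(u)           # uniform hemisphere: cos(theta) = u
        phi = 2.0 * math.pi * v
        radiance = radiance_fn(theta, phi)
        for j, h in enumerate(basis_fns):
            coeffs[j] += radiance * h(theta, phi)
    solid_angle = 2.0 * math.pi
    return [c * solid_angle / n_samples for c in coeffs]

# Constant basis term, normalized so its squared integral over the
# hemisphere equals 1 (analogous to the lowest-order HSH function).
h00 = lambda theta, phi: 1.0 / math.sqrt(2.0 * math.pi)
```

For a constant radiance field the estimator is exact: the projected coefficient equals the hemisphere integral of the basis function times the radiance.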
(5) Using the local-frame component of each element of S, rotate each element's local frame into alignment with the local frame at intersection point p_i. With the gradient-based interpolation method, interpolate the HSHBF expansion coefficients C_l^m of the incident radiance at p_i from the incident radiance records in S. Compute the direction (theta_i, phi_i) that the chief ray (B01) corresponds to in the positive-N_i hemisphere of the local frame (U_i, V_i, N_i) at p_i, and take the value of sum_{l,m} C_l^m B_l^m(theta_i, phi_i) as the glossy indirect illumination value at p_i, where the B_l^m are the HSHBF expansion coefficients of the BRDF of the surface at p_i.
(6) Take the sum of the direct illumination value at intersection point p_i and the indirect illumination values of the reflections as the final illumination result at p_i.
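Assuming an orthonormal hemisphere basis, the interpolation and reflection evaluation of substep (5) reduce to a weighted average of cached coefficient vectors followed by a coefficient dot product. The sketch below omits the translation-gradient correction of the actual method and uses simple inverse-error weights as an assumption; record layout and names are illustrative:

```python
import math

def interpolate_then_reflect(records, p_i, n_i, r_i, brdf_coeffs):
    """Sketch of substep (5): interpolate the HSHBF coefficients of the
    incident radiance at p_i from the cached records in S using
    inverse-error weights, then contract them with the BRDF expansion
    coefficients for the chief-ray outgoing angle.

    Each record is a (position, normal, coeffs) triple; all coefficient
    vectors are assumed already rotated into the frame at p_i.
    """
    n_coeffs = len(brdf_coeffs)
    acc = [0.0] * n_coeffs
    total_w = 0.0
    for pos, normal, coeffs in records:
        dot = sum(x * y for x, y in zip(normal, n_i))
        err = math.dist(pos, p_i) / r_i + math.sqrt(max(0.0, 1.0 - dot))
        w = 1.0 / max(err, 1e-6)       # inverse-error interpolation weight
        total_w += w
        for j in range(n_coeffs):
            acc[j] += w * coeffs[j]
    interpolated = [c / total_w for c in acc]
    # Orthonormal basis: the reflection integral collapses to a dot product.
    return sum(c * b for c, b in zip(interpolated, brdf_coeffs))
```

The dot-product step is why storing both the radiance and the BRDF in the same basis pays off: no numerical quadrature is needed at shading time.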
(7) Let J = J + 1; if J > N_pix, let J = 1 and I = I + 1, where N_pix is the number of pixel columns of the image picture of the left-eye virtual camera.
(8) If I <= M_pix, where M_pix is the number of pixel rows of the image picture of the left-eye virtual camera, go to (1); otherwise go to step S205.
Step S204: let I' = M_pix and J' = N_pix, where M_pix is the number of pixel rows and N_pix the number of pixel columns of the image picture of the right-eye virtual camera, and compute as follows:
(1) Emit a chief ray (B03) from the optical-center position of the right-eye virtual camera through the pixel in row I', column J' of the right-eye camera's image plane, and test whether the chief ray (B03) intersects a surface of the 3D virtual scene. If it does not, set the color of the pixel corresponding to chief ray (B03) to the background illumination value and go to (7); otherwise compute the nearest intersection position p'_i, use the Monte Carlo method to sample the light sources, and compute the direct illumination value at intersection point p'_i.
(2) If the reflection type of the surface at intersection point p'_i is specular, recursively trace the specular secondary ray and compute the specular indirect illumination value. If the reflection type of the surface at p'_i is diffuse, go to (3). If the reflection type of the surface at p'_i is glossy (fuzzy reflection), query the shared memory to determine whether the BRDF of the surface at p'_i can be expressed as a sum of low-order HSHBF terms; if it can, go to (3); otherwise use the Monte Carlo method to importance-sample the positive-N'_i hemisphere of the local frame (U'_i, V'_i, N'_i) at p'_i (N'_i points along the surface normal at p'_i), compute the glossy indirect illumination value directly, and go to (6).
(3) Lock the incident radiance cache for memory access. Search the incident radiance cache for records (B04) that satisfy the following condition:

Condition A': ||p' - p'_i|| / R_i + sqrt(1 - n' . n'_i) < a

where p' is the position-component member variable of the incident radiance record (B04), n' is the normal vector at the surface point represented by that position component, n'_i is the unit normal vector at intersection point p'_i, R_i is the harmonic mean distance from p'_i to all visible objects, and a is a precision threshold given in advance. Unlock the incident radiance cache.
(4) Put all found incident radiance records (B04) that satisfy Condition A' into a set S'. If S' is non-empty, go to (5); otherwise use the Monte Carlo method to draw solid-angle samples over the positive-N'_i hemisphere of the local frame (U'_i, V'_i, N'_i) at p'_i, compute the incident radiance value L(omega_k) in each sampled direction omega_k (k = 1, ..., N, where N is the number of angle samples), and from these values compute the HSHBF expansion coefficients C_l^m of the incident radiance and their translation gradients grad C_l^m, where l = 0, 1, ..., n_v and m = -l, ..., l. Create a new incident radiance record whose position component is assigned p'_i, whose local-frame component is assigned (U'_i, V'_i, N'_i), whose HSHBF expansion-coefficient component is assigned the C_l^m, and whose coefficient translation-gradient component is assigned the grad C_l^m. Lock the incident radiance cache for memory access, add the new record to the grid cell of the cache corresponding to position p'_i and also add it to the set S', then unlock the incident radiance cache.
(5) Using the local-frame component of each element of S', rotate each element's local frame into alignment with the local frame at intersection point p'_i. With the gradient-based interpolation method, interpolate the HSHBF expansion coefficients C_l^m of the incident radiance at p'_i from the incident radiance records in S'. Compute the direction (theta'_i, phi'_i) that the chief ray (B03) corresponds to in the positive-N'_i hemisphere of the local frame (U'_i, V'_i, N'_i) at p'_i, and take the value of sum_{l,m} C_l^m B_l^m(theta'_i, phi'_i) as the glossy indirect illumination value at p'_i, where the B_l^m are the HSHBF expansion coefficients of the BRDF of the surface at p'_i.
(6) Take the sum of the direct illumination value at intersection point p'_i and the indirect illumination values of the reflections as the final illumination result at p'_i.
(7) Let J' = J' - 1; if J' < 1, let J' = N_pix and I' = I' - 1.
(8) If I' >= 1, go to (1); otherwise go to step S205.
Step S205: the drawing of the image pictures of the left-eye and right-eye virtual cameras is finished.
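The two-thread scheme of steps S202 through S205 — the left eye scanning pixels from the first row forward, the right eye from the last row backward, both sharing one locked incident radiance cache — can be sketched as follows. The `trace_pixel` callback stands in for the per-pixel work of steps S203/S204, and all names are illustrative:

```python
import threading

def render_eye(rows, cols, reverse, cache, lock, trace_pixel):
    """One thread per eye: scan the image either forward from pixel
    (0, 0) or backward from the last pixel, so the two threads converge
    toward the middle and each can reuse cache entries the other has
    already created.
    """
    order = range(rows * cols - 1, -1, -1) if reverse else range(rows * cols)
    for k in order:
        i, j = divmod(k, cols)        # row-major pixel index
        trace_pixel(i, j, cache, lock)

def draw_stereo(rows, cols, trace_left, trace_right):
    cache, lock = {}, threading.Lock()   # shared incident-radiance cache
    t_left = threading.Thread(target=render_eye,
                              args=(rows, cols, False, cache, lock, trace_left))
    t_right = threading.Thread(target=render_eye,
                               args=(rows, cols, True, cache, lock, trace_right))
    t_left.start(); t_right.start()
    t_left.join(); t_right.join()
    return cache
```

The opposite scan orders matter because neighboring stereo pixels see nearly the same scene points: starting the two eyes at opposite ends maximizes the chance that, by the time either thread reaches a region, usable cache records already exist.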
Claims (1)
1. A left/right-eye stereoscopic picture drawing method for a 3D virtual scene containing a fuzzy (glossy) reflection effect, characterized in that the required equipment, data structures, and execution steps are as follows:
The object of the present invention is to provide a left/right-eye stereoscopic picture drawing method for a 3D virtual scene containing a fuzzy reflection effect. The equipment required by the method comprises a computer system with a dual-core CPU and shared memory. When the left-eye virtual camera (101) and the right-eye virtual camera (102) can simultaneously photograph points p_1, p_2, and p_3 on the surface of an object (105) in the 3D virtual scene, the illumination values of p_1, p_2, and p_3 are stored in pixels on the image plane (103) of the left-eye virtual camera (101) and on the image plane (104) of the right-eye virtual camera (102), respectively. First, the Bidirectional Reflectance Distribution Function (BRDF) of each diffuse and glossy reflecting surface in the 3D virtual scene is expanded into a sum of low-order Hemispherical Harmonics Basis Functions (HSHBF); the coefficient of each HSHBF order is stored in shared memory and associated with that surface's BRDF. The image pictures of the left-eye virtual camera (101) and the right-eye virtual camera (102) are then drawn simultaneously on the dual-core CPU: the incident radiance at diffuse and glossy surface points is expanded into a sum of low-order HSHBF terms and stored in the incident radiance cache, and gradient-based interpolation of the incident radiance allows previously computed incident radiance results to be reused, thereby improving the rendering speed of the left/right-eye stereoscopic pictures of the 3D virtual scene.
The method provides an incident radiance record structure comprising member variables such as a position component, a local-frame component, an HSHBF expansion-coefficient component, and an HSHBF expansion-coefficient translation-gradient component.
The method stores incident radiance records in an incident radiance cache located in shared memory. To make it easy to retrieve records by spatial position and to add new records, the cache uses a uniform three-dimensional grid data structure: given a three-dimensional position, all incident radiance records in the corresponding grid cell can be retrieved directly, and new records can be added to that cell.
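A minimal sketch of the uniform-grid cache described above, assuming records are bucketed by a fixed cell size (cell size, class name, and record layout are illustrative, not from the patent):

```python
class RadianceCache:
    """Uniform-grid incident-radiance cache: records are bucketed by
    their quantized 3D position, so lookup and insertion by spatial
    position each touch only one grid cell.
    """
    def __init__(self, cell_size):
        self.cell_size = cell_size
        self.grid = {}                 # cell index (i, j, k) -> list of records

    def _cell(self, p):
        # Quantize a 3D position to its integer grid-cell index.
        return tuple(int(c // self.cell_size) for c in p)

    def add(self, p, record):
        self.grid.setdefault(self._cell(p), []).append(record)

    def lookup(self, p):
        return self.grid.get(self._cell(p), [])
```

A dictionary keyed by cell index keeps memory proportional to the number of occupied cells, which suits the sparse distribution of surface points in a 3D scene.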
The first part of the method computes the low-order HSHBF expansion coefficients of the BRDFs of all diffuse and glossy reflecting surfaces in the 3D virtual scene. The concrete steps are as follows:
Step S101: set the minimum HSHBF order n_l, the maximum HSHBF order n_h, and the error threshold E_t of the HSHBF expansion.
Step S102: regard diffuse reflection as a special case of glossy reflection. For the BRDF (A101) of each diffuse and glossy reflecting surface in the 3D virtual scene, mark BRDF (A101) as expressible as a sum of low-order HSHBF terms, set n_v = n_l, and compute as follows:
(1) Draw solid-angle samples over the positive hemisphere according to a cosine distribution. At each angle sample (theta_o, phi_o) (theta_o is the polar-angle component, phi_o is the azimuth component), compute the expansion coefficients

B_l^m = integral over the hemisphere of f(theta_o, phi_o; theta, phi) H_l^m(theta, phi) d omega,  l = 0, 1, ..., n_v, m = -l, ..., l,

where f(theta_o, phi_o; theta, phi) is the value of BRDF (A101) for outgoing direction (theta_o, phi_o) and incident direction (theta, phi), H_l^m is a hemispherical harmonics basis function, and B_l^m is the HSHBF expansion coefficient of BRDF (A101) associated with the angle sample (theta_o, phi_o).
(2) Compute the expansion error E of the truncated series. If E > E_t and n_v < n_h, set n_v = n_v + 1 and go to (1). If E > E_t and n_v = n_h, mark BRDF (A101) as not expressible as a sum of low-order HSHBF terms. If E <= E_t, store all HSHBF expansion coefficients B_l^m in shared memory and associate them with BRDF (A101) and the angle sample (theta_o, phi_o).
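The order-refinement loop of step S102 can be sketched as follows. The callables `project` and `error` stand in for the coefficient computation and the expansion-error evaluation, which are assumed supplied; the function name is illustrative:

```python
def expand_brdf(project, error, n_l, n_h, e_t):
    """Control flow of step S102: raise the expansion order n_v from
    the minimum n_l until the expansion error drops to the threshold
    e_t or the maximum order n_h is reached.

    `project(n_v)` returns the HSHBF coefficients up to order n_v;
    `error(coeffs)` returns the expansion error for those coefficients.
    Returns (coeffs, True) on success, or (None, False) when the BRDF
    cannot be represented with orders up to n_h.
    """
    n_v = n_l
    while True:
        coeffs = project(n_v)
        if error(coeffs) <= e_t:
            return coeffs, True        # store coefficients in shared memory
        if n_v == n_h:
            return None, False         # mark BRDF as not expressible
        n_v += 1
```

Surfaces whose BRDF fails this test fall back to direct Monte Carlo importance sampling at render time, as described in substep (2) of steps S203/S204.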
The second part of the method draws the image pictures of the left-eye and right-eye virtual cameras in parallel on a computer system with a dual-core CPU and shared memory. By caching incident radiance values that have already been computed and applying a gradient-based interpolation method, the reuse of incident radiance computation results is achieved so as to improve the rendering speed of the image pictures of the two virtual cameras. The concrete steps are as follows:
Step S201: clear the incident radiance cache in shared memory.
Step S202: run the drawing program of the left-eye virtual camera's image picture on the first CPU core and, at the same time, run the drawing program of the right-eye virtual camera's image picture on the second CPU core. The left-eye drawing program executes step S203; the right-eye drawing program executes step S204.
Step S203: let I = 1 and J = 1, and compute as follows:
(1) Emit a chief ray (B01) from the optical-center position of the left-eye virtual camera through the pixel in row I, column J of the left-eye camera's image plane, and test whether the chief ray (B01) intersects a surface of the 3D virtual scene. If it does not, set the color of the pixel corresponding to chief ray (B01) to the background illumination value and go to (7); otherwise compute the nearest intersection position p_i, use the Monte Carlo method to sample the light sources, and compute the direct illumination value at intersection point p_i.
(2) If the reflection type of the surface at intersection point p_i is specular, recursively trace the specular secondary ray and compute the specular indirect illumination value. If the reflection type of the surface at p_i is diffuse, go to (3). If the reflection type of the surface at p_i is glossy (fuzzy reflection), query the shared memory to determine whether the BRDF of the surface at p_i can be expressed as a sum of low-order HSHBF terms; if it can, go to (3); otherwise use the Monte Carlo method to importance-sample the positive-N_i hemisphere of the local frame (U_i, V_i, N_i) at p_i (N_i points along the surface normal at p_i), compute the glossy indirect illumination value directly, and go to (6).
(3) Lock the incident radiance cache for memory access. Search the incident radiance cache for records (B02) that satisfy the following condition:

Condition A: ||p - p_i|| / R_i + sqrt(1 - n . n_i) < a

where p is the position-component member variable of the incident radiance record (B02), n is the normal vector at the surface point represented by that position component, n_i is the unit normal vector at intersection point p_i, R_i is the harmonic mean distance from p_i to all visible objects, and a is a precision threshold given in advance. Unlock the incident radiance cache.
(4) Put all found incident radiance records (B02) that satisfy Condition A into a set S. If S is non-empty, go to (5); otherwise use the Monte Carlo method to draw solid-angle samples over the positive-N_i hemisphere of the local frame (U_i, V_i, N_i) at p_i, compute the incident radiance value L(omega_k) in each sampled direction omega_k (k = 1, ..., N, where N is the number of angle samples), and from these values compute the HSHBF expansion coefficients C_l^m of the incident radiance and their translation gradients grad C_l^m, where l = 0, 1, ..., n_v and m = -l, ..., l. Create a new incident radiance record whose position component is assigned p_i, whose local-frame component is assigned (U_i, V_i, N_i), whose HSHBF expansion-coefficient component is assigned the C_l^m, and whose coefficient translation-gradient component is assigned the grad C_l^m. Lock the incident radiance cache for memory access, add the new record to the grid cell of the cache corresponding to position p_i and also add it to the set S, then unlock the incident radiance cache.
(5) Using the local-frame component of each element of S, rotate each element's local frame into alignment with the local frame at intersection point p_i. With the gradient-based interpolation method, interpolate the HSHBF expansion coefficients C_l^m of the incident radiance at p_i from the incident radiance records in S. Compute the direction (theta_i, phi_i) that the chief ray (B01) corresponds to in the positive-N_i hemisphere of the local frame (U_i, V_i, N_i) at p_i, and take the value of sum_{l,m} C_l^m B_l^m(theta_i, phi_i) as the glossy indirect illumination value at p_i, where the B_l^m are the HSHBF expansion coefficients of the BRDF of the surface at p_i.
(6) Take the sum of the direct illumination value at intersection point p_i and the indirect illumination values of the reflections as the final illumination result at p_i.
(7) Let J = J + 1; if J > N_pix, let J = 1 and I = I + 1, where N_pix is the number of pixel columns of the image picture of the left-eye virtual camera.
(8) If I <= M_pix, where M_pix is the number of pixel rows of the image picture of the left-eye virtual camera, go to (1); otherwise go to step S205.
Step S204: let I' = M_pix and J' = N_pix, where M_pix is the number of pixel rows and N_pix the number of pixel columns of the image picture of the right-eye virtual camera, and compute as follows:
(1) Emit a chief ray (B03) from the optical-center position of the right-eye virtual camera through the pixel in row I', column J' of the right-eye camera's image plane, and test whether the chief ray (B03) intersects a surface of the 3D virtual scene. If it does not, set the color of the pixel corresponding to chief ray (B03) to the background illumination value and go to (7); otherwise compute the nearest intersection position p'_i, use the Monte Carlo method to sample the light sources, and compute the direct illumination value at intersection point p'_i.
(2) If the reflection type of the surface at intersection point p'_i is specular, recursively trace the specular secondary ray and compute the specular indirect illumination value. If the reflection type of the surface at p'_i is diffuse, go to (3). If the reflection type of the surface at p'_i is glossy (fuzzy reflection), query the shared memory to determine whether the BRDF of the surface at p'_i can be expressed as a sum of low-order HSHBF terms; if it can, go to (3); otherwise use the Monte Carlo method to importance-sample the positive-N'_i hemisphere of the local frame (U'_i, V'_i, N'_i) at p'_i (N'_i points along the surface normal at p'_i), compute the glossy indirect illumination value directly, and go to (6).
(3) Lock the incident radiance cache for memory access. Search the incident radiance cache for records (B04) that satisfy the following condition:

Condition A': ||p' - p'_i|| / R_i + sqrt(1 - n' . n'_i) < a

where p' is the position-component member variable of the incident radiance record (B04), n' is the normal vector at the surface point represented by that position component, n'_i is the unit normal vector at intersection point p'_i, R_i is the harmonic mean distance from p'_i to all visible objects, and a is a precision threshold given in advance. Unlock the incident radiance cache.
(4) Put all found incident radiance records (B04) that satisfy Condition A' into a set S'. If S' is non-empty, go to (5); otherwise use the Monte Carlo method to draw solid-angle samples over the positive-N'_i hemisphere of the local frame (U'_i, V'_i, N'_i) at p'_i, compute the incident radiance value L(omega_k) in each sampled direction omega_k (k = 1, ..., N, where N is the number of angle samples), and from these values compute the HSHBF expansion coefficients C_l^m of the incident radiance and their translation gradients grad C_l^m, where l = 0, 1, ..., n_v and m = -l, ..., l. Create a new incident radiance record whose position component is assigned p'_i, whose local-frame component is assigned (U'_i, V'_i, N'_i), whose HSHBF expansion-coefficient component is assigned the C_l^m, and whose coefficient translation-gradient component is assigned the grad C_l^m. Lock the incident radiance cache for memory access, add the new record to the grid cell of the cache corresponding to position p'_i and also add it to the set S', then unlock the incident radiance cache.
(5) Using the local-frame component of each element of S', rotate each element's local frame into alignment with the local frame at intersection point p'_i. With the gradient-based interpolation method, interpolate the HSHBF expansion coefficients C_l^m of the incident radiance at p'_i from the incident radiance records in S'. Compute the direction (theta'_i, phi'_i) that the chief ray (B03) corresponds to in the positive-N'_i hemisphere of the local frame (U'_i, V'_i, N'_i) at p'_i, and take the value of sum_{l,m} C_l^m B_l^m(theta'_i, phi'_i) as the glossy indirect illumination value at p'_i, where the B_l^m are the HSHBF expansion coefficients of the BRDF of the surface at p'_i.
(6) Take the sum of the direct illumination value at intersection point p'_i and the indirect illumination values of the reflections as the final illumination result at p'_i.
(7) Let J' = J' - 1; if J' < 1, let J' = N_pix and I' = I' - 1.
(8) If I' >= 1, go to (1); otherwise go to step S205.
Step S205: the drawing of the image pictures of the left-eye and right-eye virtual cameras is finished.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN 201110225275 CN102306401B (en) | 2011-08-08 | 2011-08-08 | Left/right-eye three-dimensional picture drawing method for three-dimensional (3D) virtual scene containing fuzzy reflection effect |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102306401A true CN102306401A (en) | 2012-01-04 |
CN102306401B CN102306401B (en) | 2013-08-28 |
Family
ID=45380258
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN 201110225275 Expired - Fee Related CN102306401B (en) | 2011-08-08 | 2011-08-08 | Left/right-eye three-dimensional picture drawing method for three-dimensional (3D) virtual scene containing fuzzy reflection effect |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102306401B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060055888A1 (en) * | 2004-09-16 | 2006-03-16 | Canon Kabushiki Kaisha | Projector-type image display apparatus |
CN101064327A (en) * | 2006-04-29 | 2007-10-31 | 联华电子股份有限公司 | Image sensing element and method for making the same |
Non-Patent Citations (2)
Title |
---|
WISKOTT, L. et al.: "Face recognition by elastic bunch graph matching", IEEE Transactions on Pattern Analysis and Machine Intelligence |
FEI Zhang et al.: "Real-time fast 3D rendering of the reflection effect of water drops in air", Computer Engineering and Applications |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103995700A (en) * | 2014-05-14 | 2014-08-20 | 无锡梵天信息技术股份有限公司 | Method for achieving global illumination of 3D game engine |
CN105006011A (en) * | 2015-07-21 | 2015-10-28 | 长春理工大学 | Realistic three-dimensional scene body feeling interactive drawing system and method |
CN105006011B (en) * | 2015-07-21 | 2017-08-25 | 长春理工大学 | The body feeling interaction formula drawing system and method for sense of reality three-dimensional scenic |
CN105447905A (en) * | 2015-11-17 | 2016-03-30 | 长春理工大学 | Three dimensional scene approximation soft shadow light tracking based on visible smooth filtering |
CN105447905B (en) * | 2015-11-17 | 2018-03-06 | 长春理工大学 | Three-dimensional scenic approximation soft shadows method for drafting based on observability smothing filtering |
CN107274474A (en) * | 2017-07-03 | 2017-10-20 | 长春理工大学 | Indirect light during three-dimensional scenic stereoscopic picture plane is drawn shines multiplexing method |
CN107274474B (en) * | 2017-07-03 | 2020-06-23 | 长春理工大学 | Indirect illumination multiplexing method in three-dimensional scene three-dimensional picture drawing |
CN109493409A (en) * | 2018-11-05 | 2019-03-19 | 长春理工大学 | Virtual three-dimensional scene stereoscopic picture plane method for drafting based on right and left eyes spatial reuse |
CN109493409B (en) * | 2018-11-05 | 2022-08-23 | 长春理工大学 | Virtual three-dimensional scene stereo picture drawing method based on left-right eye space multiplexing |
CN112002003A (en) * | 2020-08-26 | 2020-11-27 | 长春理工大学 | Spherical panoramic stereo picture generation and interactive display method for virtual 3D scene |
Also Published As
Publication number | Publication date |
---|---|
CN102306401B (en) | 2013-08-28 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| C14 | Grant of patent or utility model | |
| GR01 | Patent grant | |
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20130828; Termination date: 20140808 |
| EXPY | Termination of patent right or utility model | |