WO2013005278A1 - Display device - Google Patents
Display device
- Publication number
- WO2013005278A1 (PCT/JP2011/065187)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- angle
- image forming
- image
- epe
- display device
- Prior art date
Links
Images
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/147—Optical correction of image distortions, e.g. keystone
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/28—Reflectors in projection beam
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/011—Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0123—Head-up displays characterised by optical features comprising devices increasing the field of view
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0149—Head-up displays characterised by mechanical features
- G02B2027/0161—Head-up displays characterised by mechanical features characterised by the relative positioning of the constitutive elements
Definitions
- the present invention relates to a display device for visually recognizing an image as a virtual image.
- a display device such as a head-up display that allows a user to visually recognize an image as a virtual image from the position (eye point) of the user's eyes.
- two concave mirrors for enlarging an image are provided in a light path from an image display device to an observer, and a virtual image is formed by reflecting the image of the image display device twice.
- a technique has been proposed for improving the magnification ratio of a virtual image and reducing the shape distortion of the virtual image.
- a projection optical system has been proposed in which the display surface of the image display element and the projection surface of the projection optical system are decentered in the YZ section, the projection surface is an arc-shaped curved surface whose chord faces the projection optical system side in the XZ section, and at least one rotationally asymmetric correction optical surface that corrects image distortion is provided.
- the virtual image tends to be distorted.
- distortions having different shapes tend to occur in virtual images observed with the left and right eyes. In this case, the user may be fatigued.
- the distortion of the virtual image can be reduced.
- the entire head-up display may become large, which is disadvantageous for mounting in the vehicle interior.
- when an aspherical or free-form optical element is used as in Patent Document 2, the distortion of a virtual image can be made very small, but inexpensive manufacturing is difficult.
- An object of the present invention is to provide a display device that can appropriately suppress distortion of a virtual image with a simple configuration.
- the display device includes an image forming element that forms an image to be displayed, and an optical element that displays a virtual image by reflecting light emitted from the image forming element.
- the first angle, which is the angle of the image forming element with respect to the traveling direction of light traveling from the center of the image forming element to the center of the optical element, is set according to a second angle, which is the angle of the optical element with respect to the traveling direction of the light, so that distortion of the virtual image displayed by the optical element is reduced.
- the display device includes an image forming element that forms an image to be displayed, an optical element that displays a virtual image by reflecting light emitted from the image forming element, first angle changing means for changing an angle of the optical element with respect to a traveling direction of light traveling from the center of the image forming element to the center of the optical element, and second angle changing means for changing the angle of the image forming element with respect to the traveling direction of the light so as to reduce distortion of the virtual image resulting from a change in the angle of the optical element.
- FIG. 1 shows an overall configuration of a display device according to a first embodiment.
- the configuration of the laser projector is shown.
- a diagram for explaining the tilt angles of the EPE and the combiner is shown.
- a diagram defining the tilt angles of the EPE and the combiner is shown.
- the image formed on the EPE in the simulation is shown.
- a diagram for explaining the simulation conditions at the eyebox center is shown.
- an example of simulation results at the eyebox center in the first embodiment is shown.
- a diagram for explaining the simulation conditions with both eyes is shown.
- an example of simulation results with both eyes in the first embodiment is shown.
- a diagram for explaining the reason why distortion of the virtual image can be suppressed in the first embodiment is shown.
- the relationship between the tilt angle of the EPE and the amount of distortion of the virtual image is shown for each radius of curvature of the combiner.
- An example of simulation results when the tilt angle of the EPE is equal to the tilt angle of the combiner is shown.
- a specific example of the arrangement of the laser projector and the EPE is shown.
- a diagram for explaining a method of correcting the distortion of the image formed on the EPE is shown.
- the overall configuration of a display device according to a third embodiment is shown.
- an example of simulation results at the eyebox center in the third embodiment is shown.
- an example of simulation results with both eyes in the third embodiment is shown.
- the overall configuration of a display device according to Modification 1 is shown.
- the overall configuration of a display device according to Modification 2 is shown.
- the overall configuration of a display device according to Modification 3 is shown.
- An example of a simulation result when using a field lens according to Modification 4 is shown.
- a display device includes: an image forming element that forms an image to be displayed; and an optical element that displays a virtual image by reflecting light emitted from the image forming element.
- the first angle, which is the angle of the image forming element with respect to the traveling direction of light traveling from the center of the image forming element to the center of the optical element, is set according to a second angle, which is the angle of the optical element with respect to the traveling direction of the light, so that distortion of the virtual image displayed by the optical element is reduced.
- the above display device is a device for visually recognizing an image as a virtual image.
- the image forming element forms an image to be displayed, and the optical element displays a virtual image by reflecting the light emitted from the image forming element.
- the “first angle” is defined as the angle of the image forming element with respect to the traveling direction of light traveling from the center of the image forming element to the center of the optical element, and the “second angle” is defined as the angle of the optical element with respect to the traveling direction of that light.
- the first angle of the image forming element is set according to the second angle of the optical element so that the distortion of the virtual image displayed by the optical element is reduced.
- the image forming element is inclined by the first angle according to the second angle of the optical element so that the distortion of the virtual image is reduced. According to the display device, it is possible to appropriately suppress the distortion of the virtual image.
- angle changing means is further provided for changing the angle of the image forming element with respect to the light traveling direction so as to reduce distortion of the virtual image due to a change in the angle of the optical element with respect to the light traveling direction.
- the display device further includes means for changing the angle of the optical element.
- the first angle is set to be larger than the second angle. Therefore, distortion of a virtual image can be suppressed effectively.
- the first angle is set to be approximately twice the second angle.
- in this way, distortion of the virtual image can be suppressed more effectively.
- the trapezoidal distortion of the virtual image, the difference in the aspect ratio between the image formed on the image forming element and the virtual image, and the difference in the shape of the virtual image observed with both eyes can be appropriately suppressed. Further, by suppressing the difference in the shape of the virtual image observed with both eyes, it becomes easy to correct the distortion of the virtual image by image processing.
- the optical element has a concave shape in the traveling direction of the light emitted from the image forming element, and the image forming element has a planar shape.
- the angle changing means can change the angle of the image forming element with respect to the traveling direction so as to reduce the trapezoidal distortion of the virtual image and/or the difference in aspect ratio between the image formed on the image forming element and the virtual image.
- the angle changing means changes the angle of the image forming element with respect to the traveling direction of the light while maintaining the relative angular relationship between the image forming element and the optical element under a predetermined condition. More preferably, the angle changing means changes the angle of the image forming element with respect to the light traveling direction to approximately twice the angle of the optical element with respect to the light traveling direction.
- a virtual image distortion correction element is disposed between the image forming element and the optical element.
- the virtual image distortion correction element corrects bow distortion of the virtual image caused by the curvature of the optical element.
- the image forming element is an exit pupil enlarging element that enlarges an exit pupil of light emitted from a light source.
- the exit pupil enlarging element is a microlens array in which a plurality of microlenses are arranged.
- the display device further includes correction means for correcting distortion of an image formed on the image forming element by light emitted from the light source.
- the correction unit corrects distortion of the image caused by an angle of light emitted from the light source with respect to the image forming element.
- a liquid crystal display or an organic EL display can be used as the image forming element.
- the first angle is the angle formed by the image forming element with respect to an axis that is perpendicular to the traveling direction of light traveling from the center of the image forming element to the center of the optical element and that passes through the center of the image forming element, and the second angle is the angle formed by the optical element with respect to an axis that is perpendicular to that traveling direction and that passes through the center of the optical element.
- the display device includes an image forming element that forms an image to be displayed, an optical element that displays a virtual image by reflecting light emitted from the image forming element, first angle changing means for changing the angle of the optical element with respect to the traveling direction of light traveling from the center of the image forming element to the center of the optical element, and second angle changing means for changing the angle of the image forming element with respect to the traveling direction of the light so as to reduce distortion of the virtual image resulting from the change in the angle of the optical element.
- FIG. 1 is a block diagram schematically showing the overall configuration of a display device 100 according to the first embodiment. Here, an outline of each component of the display device 100 will be described.
- the display device 100 mainly includes a laser projector 1, an exit pupil expanding element (hereinafter referred to as “EPE (Exit-Pupil Expander)” where appropriate) 11, a field lens 12, and a combiner 13.
- the display device 100 is a device that allows the user to visually recognize an image as a virtual image from the position (eye point) of the user's eyes.
- the display device 100 is used as a head-up display, for example.
- the laser projector 1 includes a red, green, and blue laser light source, a scanning mechanism that scans laser light emitted from the laser light source, and a control unit that controls these.
- the light emitted from the laser projector 1 enters the EPE 11. Details of the laser projector 1 will be described later.
- the EPE 11 enlarges the exit pupil of the light emitted from the laser projector 1 and forms an intermediate image of the image presented to the user.
- the EPE 11 is a microlens array in which a plurality of microlenses are arranged.
- the light emitted from the EPE 11 enters the field lens 12.
- the EPE 11 corresponds to an example of the “image forming element” in the present invention. Note that an image without distortion is formed on the EPE 11 as an intermediate image.
- the field lens 12 is a lens disposed on the exit side of the EPE 11.
- the field lens 12 is configured so that the surface on which light from the EPE 11 is incident (that is, the incident surface) is planar, and the surface opposite to it (that is, the exit surface) is a spherical convex surface.
- the field lens 12 is spaced apart from the EPE 11 so that the planar surface is parallel to the surface of the EPE 11.
- the field lens 12 is arranged such that an axis connecting the center of the field lens 12 and the center of the EPE 11 is parallel to an axis extending in the vertical direction of the EPE 11.
- the light emitted from the field lens 12 enters the combiner 13.
- the field lens 12 corresponds to an example of a “virtual image distortion correction element” in the present invention.
- the combiner 13 is a half mirror that displays the image corresponding to the incident light as an enlarged virtual image by reflecting the light from the field lens 12.
- the combiner 13 is configured so that the surface on which light from the field lens 12 is incident (that is, the incident surface) is a spherical concave surface.
- the user observes a virtual image from an eye point that is a predetermined distance away from the combiner 13.
- the combiner 13 corresponds to an example of the “optical element” in the present invention.
- the EPE 11 and the field lens 12 are integrated with the laser projector 1. That is, in the display device 100, the laser projector 1, the EPE 11, and the field lens 12 are not configured separately, but the laser projector 1, the EPE 11, and the field lens 12 are configured as an integrated device.
- the laser projector 1 includes an image signal input unit 2, a video ASIC 3, a frame memory 4, a ROM 5, a RAM 6, a laser driver ASIC 7, a MEMS control unit 8, a laser light source unit 9, and a MEMS mirror 10.
- the image signal input unit 2 receives an image signal input from the outside and outputs it to the video ASIC 3.
- the video ASIC 3 is a block that controls the laser driver ASIC 7 and the MEMS control unit 8 based on the image signal input from the image signal input unit 2 and the scanning position information Sc input from the MEMS mirror 10, and is configured as an ASIC (Application Specific Integrated Circuit).
- the video ASIC 3 includes a synchronization / image separation unit 31, a bit data conversion unit 32, a light emission pattern conversion unit 33, and a timing controller 34.
- the synchronization / image separation unit 31 separates the image data displayed on the EPE 11 serving as the image display unit and the synchronization signal from the image signal input from the image signal input unit 2 and writes the image data to the frame memory 4.
- the bit data converter 32 reads the image data written in the frame memory 4 and converts it into bit data.
- the light emission pattern conversion unit 33 converts the bit data converted by the bit data conversion unit 32 into a signal representing the light emission pattern of each laser.
- the timing controller 34 controls the operation timing of the synchronization / image separation unit 31 and the bit data conversion unit 32.
- the timing controller 34 also controls the operation timing of the MEMS control unit 8 described later.
- the image data separated by the synchronization / image separation unit 31 is written into the frame memory 4.
- the ROM 5 stores a control program and data for operating the video ASIC 3. Various data are sequentially read from and written into the RAM 6 as a work memory when the video ASIC 3 operates.
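- as an illustration only, the dataflow described above for the video ASIC 3 can be pictured as the following software analogue (a sketch; the class, field, and function names are invented here and are not part of the patent, which describes a hardware ASIC):

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """Illustrative stand-in for one frame of image data held in the frame memory 4."""
    pixels: list          # rows of (R, G, B) tuples
    hsync: object = None  # separated synchronization info (placeholder)

def video_pipeline(image_signal):
    """Software analogue of the video ASIC 3 dataflow described above
    (separation -> frame memory -> bit data -> emission pattern).
    Names are illustrative, not the patent's."""
    # Synchronization / image separation unit 31: split sync from image data.
    sync, image_data = image_signal["sync"], image_signal["pixels"]
    frame_memory = Frame(pixels=image_data, hsync=sync)      # frame memory 4

    # Bit data conversion unit 32: read back and convert to per-pixel bit data.
    bit_data = [[tuple(int(c) & 0xFF for c in px) for px in row]
                for row in frame_memory.pixels]

    # Light emission pattern conversion unit 33: map bit data to per-laser drive levels.
    emission_pattern = [[{"R": r, "G": g, "B": b} for (r, g, b) in row]
                        for row in bit_data]
    return emission_pattern

# Example with a tiny 1x2 "frame":
print(video_pipeline({"sync": "hsync/vsync", "pixels": [[(255, 0, 0), (0, 128, 255)]]}))
```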
- the laser driver ASIC 7 is a block that generates a signal for driving a laser diode provided in a laser light source unit 9 described later, and is configured as an ASIC.
- the laser driver ASIC 7 includes a red laser driving circuit 71, a blue laser driving circuit 72, and a green laser driving circuit 73.
- the red laser driving circuit 71 drives the red laser LD1 based on the signal output from the light emission pattern conversion unit 33.
- the blue laser drive circuit 72 drives the blue laser LD2 based on the signal output from the light emission pattern conversion unit 33.
- the green laser drive circuit 73 drives the green laser LD3 based on the signal output from the light emission pattern conversion unit 33.
- the MEMS control unit 8 controls the MEMS mirror 10 based on a signal output from the timing controller 34.
- the MEMS control unit 8 includes a servo circuit 81 and a driver circuit 82.
- the servo circuit 81 controls the operation of the MEMS mirror 10 based on a signal from the timing controller.
- the driver circuit 82 amplifies the control signal of the MEMS mirror 10 output from the servo circuit 81 to a predetermined level and outputs the amplified signal.
- the laser light source unit 9 emits laser light to the MEMS mirror 10 based on the drive signal output from the laser driver ASIC 7.
- the MEMS mirror 10 as a scanning unit reflects the laser light emitted from the laser light source unit 9 toward the EPE 11. By doing so, the MEMS mirror 10 forms an image to be displayed on the EPE 11.
- the MEMS mirror 10 moves so as to scan on the EPE 11 under the control of the MEMS control unit 8 in order to display an image input to the image signal input unit 2, and scan position information (for example, at that time) Information such as the angle of the mirror) is output to the video ASIC 3.
- the laser light source unit 9 includes a case 91, a wavelength selective element 92, a collimator lens 93, a red laser LD1, a blue laser LD2, a green laser LD3, and a monitor light receiving element (hereinafter simply referred to as a “light receiving element”) 50.
- when the red laser LD1, the blue laser LD2, and the green laser LD3 are not distinguished, they are simply referred to as the “laser LD”.
- the case 91 is formed in a substantially box shape with resin or the like.
- the case 91 is provided with a CAN mounting portion 91a, which has a concave cross section and a hole penetrating into the case 91, and, on a surface perpendicular to the CAN mounting portion 91a, a collimator mounting portion 91b, which also has a concave cross section and a hole penetrating into the case 91.
- the wavelength-selective element 92 as a synthesis element is configured by, for example, a trichromatic prism, and is provided with a reflective surface 92a and a reflective surface 92b.
- the reflection surface 92a transmits the laser light emitted from the red laser LD1 toward the collimator lens 93, and reflects the laser light emitted from the blue laser LD2 toward the collimator lens 93.
- the reflecting surface 92b transmits most of the laser light emitted from the red laser LD1 and the blue laser LD2 toward the collimator lens 93 and reflects a part thereof toward the light receiving element 50.
- the reflection surface 92 b reflects most of the laser light emitted from the green laser LD 3 toward the collimator lens 93 and transmits part of the laser light toward the light receiving element 50. In this way, the emitted light from each laser is superimposed and incident on the collimator lens 93 and the light receiving element 50.
- the wavelength selective element 92 is provided in the vicinity of the collimator mounting portion 91b in the case 91.
- the collimator lens 93 converts the laser light incident from the wavelength selective element 92 into parallel light and emits it to the MEMS mirror 10.
- the collimator lens 93 is fixed to the collimator mounting portion 91b of the case 91 with a UV adhesive or the like. That is, the collimator lens 93 is provided after the synthesis element.
- the red laser LD1 as a laser light source emits red laser light.
- the red laser LD1 is fixed at a position that is coaxial with the wavelength selective element 92 and the collimator lens 93 in the case 91 while the semiconductor laser light source is in the chip state or the chip is mounted on a submount or the like.
- a blue laser LD2 as a laser light source emits blue laser light.
- the blue laser LD2 is fixed at a position where the emitted laser light can be reflected toward the collimator lens 93 by the reflecting surface 92a while the semiconductor laser light source is in the chip state or the chip is mounted on the submount or the like.
- the positions of the red laser LD1 and the blue laser LD2 may be switched.
- the green laser LD3 as a laser light source is in a state of being attached to the CAN package or in a state of being attached to the frame package, and emits green laser light.
- the green laser LD 3 has a semiconductor laser light source chip B that generates green laser light in a CAN package, and is fixed to a CAN mounting portion 91 a of the case 91.
- the light receiving element 50 receives a part of the laser light emitted from each laser light source.
- the light receiving element 50 is a photoelectric conversion element such as a photodetector, and supplies a detection signal Sd, which is an electrical signal corresponding to the amount of incident laser light, to the laser driver ASIC 7.
- the tilt angle β of the EPE 11 and the tilt angle α of the combiner 13 will be described with reference to FIG. 3. Note that the tilt angle β corresponds to the “first angle” in the present invention, and the tilt angle α corresponds to the “second angle” in the present invention.
- an axis a corresponding to a line connecting the center C1 of the EPE 11 and the center C2 of the combiner 13 is defined.
- the angle at which the EPE 11 is inclined with respect to the axis b, which is perpendicular to the axis a and passes through the center C1 of the EPE 11, is defined as the tilt angle β of the EPE 11.
- the angle at which the combiner 13 is inclined with respect to the axis c, which is perpendicular to the axis a and passes through the center C2 of the combiner 13, is defined as the tilt angle α of the combiner 13.
- the tilt angles α and β are defined as “positive” in the counterclockwise direction with respect to the respective axes c and b.
- since the field lens 12 is arranged so that its planar surface is parallel to the surface of the EPE 11, the field lens 12 is also tilted at the angle β with respect to the axis orthogonal to the axis a, like the EPE 11.
- the definition of the above-mentioned axis a will be supplemented.
- the “center C1” of the EPE 11 is defined as the center of the image formed on the EPE 11, and the “center C2” of the combiner 13 is defined as the intersection of the reflecting surface of the combiner 13 with the line of sight from the eye point looking at the center C3 of the virtual image.
- the axis a is defined as a line connecting the centers C1 and C2. Even when the field lens 12 is used, the definition of the axis a is still a line connecting the center C1 and the center C2.
- the tilt angle β of the EPE 11 is set based on the tilt angle α of the combiner 13 from the viewpoint of suppressing the distortion of the virtual image displayed by the combiner 13 (for example, from the viewpoint of suppressing the difference in the shape of the virtual image observed by the left and right eyes). Specifically, the tilt angle β of the EPE 11 is set to approximately twice the tilt angle α of the combiner 13 (β ≈ 2α). In other words, the EPE 11 is installed so as to be inclined with respect to the axis b by an angle that is approximately twice the angle at which the combiner 13 is inclined with respect to the axis c.
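- stated compactly (this merely restates the relationship just described, using the simulation condition given below as a worked example):

```latex
% axis a: line from the EPE center C1 to the combiner center C2
% beta : tilt of the EPE about axis b (perpendicular to a, through C1)
% alpha: tilt of the combiner about axis c (perpendicular to a, through C2)
\beta \approx 2\alpha, \qquad \text{e.g. } \alpha = 12^{\circ} \;\Rightarrow\; \beta \approx 24^{\circ}
```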
- Tilt angle α of the combiner 13: 12 degrees
- Distance between the EPE 11 and the planar surface of the field lens 12: 2 mm
- Distance Y1 between the center C1 of the EPE 11 and the center C2 of the combiner 13: 170 mm
- Distance Y2 between the center C2 of the combiner 13 and the eye point: 500 mm
- Size of the intermediate image formed on the EPE 11: horizontal length 75 mm × vertical length 25 mm (the intermediate image is a lattice image in which undistorted squares are arranged, as shown in FIG. 5)
- FIG. 7 shows an example of simulation results at the eyebox center.
- the virtual image obtained by the simulation is indicated by a solid line
- the shape of the reference screen for comparison of the virtual image size and shape is indicated by a one-dot chain line.
- the size of the reference screen is “horizontal length 450 (mm) ⁇ vertical length 150 (mm)”.
- FIG. 7A shows an example of a simulation result when the field lens 12 is not installed between the EPE 11 and the combiner 13.
- FIG. 7A shows that the virtual image is distorted. Specifically, it can be seen that a trapezoidal distortion of a virtual image and a bow distortion of the virtual image are generated.
- FIG. 7B shows an example of simulation results when the field lens 12 is not installed between the EPE 11 and the combiner 13. From FIG. 7B, it can be seen that the trapezoidal distortion of the virtual image is appropriately removed as compared with FIG. 7A.
- FIG. 7C shows a simulation result example when the field lens 12 is installed between the EPE 11 and the combiner 13. From FIG. 7 (c), it can be seen that the bow distortion of the virtual image is appropriately removed as compared with FIG. 7 (b).
- FIG. 9 shows an example of a simulation result with both eyes.
- the virtual image obtained from the simulation is indicated by a solid line and a broken line
- the shape of the reference screen is indicated by a one-dot chain line.
- a virtual image observed with the left eye is indicated by a solid line
- a virtual image observed with the right eye is indicated by a broken line.
- FIG. 9A shows an example of a simulation result when the field lens 12 is not installed between the EPE 11 and the combiner 13. It can be seen from FIG. 9A that virtual image distortion such as trapezoidal distortion and bow distortion occurs. It can also be seen that the shape of the distortion of the virtual image observed with the left eye is different from the shape of the distortion of the virtual image observed with the right eye.
- FIG. 9B shows an example of a simulation result when the field lens 12 is not installed between the EPE 11 and the combiner 13. From FIG. 9B, it can be seen that the trapezoidal distortion of the virtual image is appropriately removed as compared with FIG. 9A. Moreover, it turns out that the difference in the shape of the virtual image observed with each of both eyes is suppressed appropriately.
- FIG. 9C shows a simulation result example when the field lens 12 is installed between the EPE 11 and the combiner 13. From FIG. 9 (c), it can be seen that the bow distortion of the virtual image is appropriately removed as compared with FIG. 9 (b). Further, it can be seen that the state in which the difference in the shape of the virtual image with both eyes is suppressed as shown in FIG. 9B is appropriately maintained.
- the distortion of the virtual image can be appropriately suppressed by setting the tilt angle β of the EPE 11 to approximately twice the tilt angle α of the combiner 13. Specifically, the trapezoidal distortion of the virtual image and the difference in the shape of the virtual image observed with both eyes can be appropriately suppressed. Further, according to the first embodiment, by inserting the field lens 12 on the exit side of the EPE 11, it is possible to appropriately suppress the bow distortion of the virtual image.
- FIG. 10A shows a side view of the EPE 11 and the combiner 13.
- FIG.10 (b) shows the top view of EPE11 and the combiner 13 observed from the direction shown by arrow A1 in Fig.10 (a).
- FIGS. 10A and 10B show the state when the tilt angle β of the EPE 11 is set to 0 degrees, that is, when the tilt angle β of the EPE 11 is not set to approximately twice the tilt angle α of the combiner 13. In this case, as shown in FIG. 10, the position on the surface of the combiner 13 at which the light beam 1 emitted from the upper part of the EPE 11 arrives is the position indicated by reference numeral 13a, and the position at which the light beam 2 emitted from the lower part of the EPE 11 arrives is the position indicated by reference numeral 13b. From this, it can be seen that the distance until the light beam 2 reaches the surface of the combiner 13 is longer than the distance until the light beam 1 reaches the surface of the combiner 13. It is presumed that this difference in distance increases the horizontal spread of the rays striking the surface of the combiner 13 toward the bottom, producing virtual image distortion that widens downward.
- FIG. 10C is a side view of the EPE 11 and the combiner 13 when the tilt angle β of the EPE 11 is set to approximately twice the tilt angle α of the combiner 13.
- in this case, the distance until the light beam 1 emitted from the upper part of the EPE 11 reaches the surface of the combiner 13 and the distance until the light beam 2 emitted from the lower part of the EPE 11 reaches the surface of the combiner 13 become almost equal. Therefore, it is considered that the distortion of the virtual image described above is reduced.
- the tilt angle β of the EPE 11 is not limited to being set to twice the tilt angle α of the combiner 13. As described above, the distortion of the virtual image can be reduced if the distances until the light beams emitted from various parts of the EPE 11 reach the surface of the combiner 13 are made substantially equal; therefore, in one example, the tilt angle β of the EPE 11 can be set according to the tilt angle α of the combiner 13 so that those distances become substantially equal. For example, the tilt angle β of the EPE 11 can be set to an angle that is at least larger than the tilt angle α of the combiner 13.
- the tilt angle β of the EPE 11 may also be set to twice the tilt angle α of the combiner 13.
- the tilt angle β of the EPE 11 does not have to be set strictly to twice the tilt angle α of the combiner 13; the tilt angle β of the EPE 11 may be set to an angle close to twice the tilt angle α of the combiner 13. This is because, if the tilt angle β of the EPE 11 is approximately twice the tilt angle α of the combiner 13, the distances until the light beams emitted from various parts of the EPE 11 reach the surface of the combiner 13 are presumed to be approximately equal.
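- as a rough illustration of the path-length argument of FIG. 10 (not the patent's simulation), the following 2D sketch treats the combiner as locally planar and assumes the chief rays from the EPE travel parallel to the axis a; both simplifications are ours. Under these assumptions the path-length spread already closes at β ≈ α, which is consistent with the observation around FIG. 12 that β = α reduces the distortion to some extent, while the full simulations with the spherical combiner and the eye point (FIG. 11) find the minimum distortion at β ≈ 2α:

```python
import math

# Toy 2D illustration of the FIG. 10 argument (not the patent's ray-tracing simulation).
Y1 = 170.0                   # mm, EPE center C1 to combiner center C2 (from the document)
ALPHA = math.radians(12.0)   # combiner tilt alpha (from the document)
HALF_HEIGHT = 12.5           # mm, half of the 25 mm intermediate-image height

def path_length(t, beta):
    """Distance along axis a from an EPE point at height t to the tilted
    (locally planar) combiner, with the EPE tilted by beta about its center."""
    x_epe = -t * math.sin(beta)                  # EPE top leans away from the combiner
    y_epe = t * math.cos(beta)
    x_combiner = Y1 - y_epe * math.tan(ALPHA)    # combiner top leans toward the EPE
    return x_combiner - x_epe

def spread(beta):
    """Difference between the longest and shortest path across the EPE height."""
    d_top = path_length(+HALF_HEIGHT, beta)
    d_bottom = path_length(-HALF_HEIGHT, beta)
    return abs(d_top - d_bottom)

for beta_deg in (0.0, 12.0, 24.0):
    print(f"beta = {beta_deg:4.1f} deg -> path-length spread = {spread(math.radians(beta_deg)):6.2f} mm")
```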
- the result when the radius of curvature of the combiner 13 is 400 mm is compared with the result when the radius of curvature of the combiner 13 is 500 mm.
- the simulation is performed using the following parameters. A case where the radius of curvature of the combiner 13 is 400 mm is appropriately expressed as “R400”, and a case where the radius of curvature of the combiner 13 is 500 mm is appropriately expressed as “R500”.
- FIG. 11 shows an example of a simulation result of R400 and R500.
- FIG. 11A shows the amount of distortion of the virtual image at R400 and R500 when the tilt angle of the EPE 11 is set to various values.
- the “distortion amount of the virtual image” is defined as the absolute value of the angle of the vertical line at the edge of the virtual image (indicated by the solid-line arrow X2 in FIGS. 11B and 11C) with respect to the vertical line of the frame (indicated by the one-dot chain line X1 in FIGS. 11B and 11C).
- FIG. 11B shows the virtual image distortion observed at the eyebox center when R400 is used and the tilt angle β of the EPE 11 is set to 0 degrees, and FIG. 11C shows the virtual image distortion observed at the eyebox center when R400 is used and the tilt angle β of the EPE 11 is set to 32 degrees.
- the tilt angle β of the EPE 11 that minimizes the amount of distortion of the virtual image is 24 degrees (twice the tilt angle α of the combiner 13) regardless of whether the radius of curvature of the combiner 13 is 400 mm or 500 mm. That is, the optimum setting of the tilt angle β of the EPE 11 does not change between a combiner curvature radius of 400 mm and 500 mm.
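- for reference, the distortion metric defined above can be computed directly from two points sampled on an edge vertical line of the simulated virtual image; a minimal sketch (the coordinates and function name below are illustrative, not taken from the patent):

```python
import math

def distortion_amount_deg(top_xy, bottom_xy):
    """Absolute angle, in degrees, of the virtual image's edge vertical line
    (through the two sampled points) relative to a true vertical reference,
    matching the metric described for FIG. 11."""
    dx = top_xy[0] - bottom_xy[0]   # horizontal deviation of the edge line
    dy = top_xy[1] - bottom_xy[1]   # vertical extent of the edge line
    return abs(math.degrees(math.atan2(dx, dy)))

# Example with made-up corner coordinates (mm) of a simulated virtual image edge:
print(distortion_amount_deg(top_xy=(226.0, 75.0), bottom_xy=(218.0, -75.0)))  # ~3.05 deg
```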
- FIG. 12 is a diagram for explaining that the distortion of the virtual image is reduced even when the tilt angle β of the EPE 11 is set to be equal to the tilt angle α of the combiner 13.
- FIG. 12A shows an example of the R400 simulation result
- FIG. 12B shows an example of the R500 simulation result.
- FIG. 12A shows the virtual image distortion observed at the eyebox center when R400 is used and the tilt angle β of the EPE 11 is set to 12 degrees, and FIG. 12B shows the virtual image distortion observed at the eyebox center when R500 is used and the tilt angle β of the EPE 11 is set to 12 degrees.
- the simulation parameters are the same as those used in the simulation shown in FIG.
- the distortion of the virtual image can be reduced to some extent even when the tilt angle β of the EPE 11 is set to about the same value as the tilt angle α of the combiner 13. If the tilt angle β of the EPE 11 is further increased from this state, it is presumed that the distortion of the virtual image is further reduced.
- the second embodiment differs from the first embodiment in that the distortion of the image formed on the EPE 11 by the light emitted from the laser projector 1 is corrected. Specifically, in the second embodiment, when the arrangement is such that the light emitted from the laser projector 1 does not enter the EPE 11 perpendicularly, the distortion of the image formed on the EPE 11 that may be caused by this non-perpendicular incidence is corrected.
- the configuration not particularly described here is the same as that of the first embodiment.
- the configuration in which the tilt angle β of the EPE 11 is set to approximately twice the tilt angle α of the combiner 13 also applies to the second embodiment.
- FIG. 13 shows a specific example of the arrangement state of the laser projector 1 and the EPE 11.
- FIG. 13A shows an arrangement state in which light from the laser projector 1 enters the EPE 11 perpendicularly. In this arrangement state, light for drawing the center of the image is projected perpendicularly from the laser projector 1 to the EPE 11.
- FIG. 13B shows an arrangement state in which light from the laser projector 1 does not enter the EPE 11 perpendicularly. In this arrangement state, light for drawing the center of the image is projected obliquely from the laser projector 1 onto the EPE 11 at a predetermined incident angle (≠ 0 degrees).
- trapezoidal distortion may occur in an image formed on the EPE 11 (meaning a real image; the same shall apply hereinafter).
- processing for correcting such trapezoidal distortion is performed.
- in the arrangement state shown in FIG. 13A, trapezoidal distortion basically does not occur in the image formed on the EPE 11.
- in the arrangement state of FIG. 13A, however, it is difficult to ensure head clearance when the display device 100 is installed in the upper part of the vehicle interior. Therefore, the arrangement state of FIG. 13B is more advantageous than that of FIG. 13A from the viewpoint of securing head clearance when the display device 100 is installed in the upper part of the vehicle interior.
- FIG. 14 is a diagram for explaining a method of correcting the trapezoidal distortion of an image formed on the EPE 11.
- FIG. 14A shows a specific example of trapezoidal distortion of an image formed on the EPE 11 that can occur when the arrangement state shown in FIG. 13B is applied.
- an image that cancels such trapezoidal distortion is drawn on the EPE 11 by the laser projector 1.
- specifically, an image whose length gradually changes from the upper side to the lower side, as shown in FIG. 14B, is drawn on the EPE 11 while the vertical width of the input image is compressed, thereby forming a lattice image without distortion.
- the laser driver ASIC 7 (see FIG. 2) in the laser projector 1 performs control to change the light emission period of the laser LD in the laser light source unit 9 (that is, control to change the timing at which the laser LD emits light), so that an image as shown in FIG. 14B is drawn by the laser projector 1 while the number of pixels does not change in any horizontal line of the image formed on the EPE 11. In this example, the laser driver ASIC 7 performs control so that the light emission period of the laser LD gradually decreases from the top to the bottom of the image formed on the EPE 11 (in other words, so that the light emission period of the laser LD gradually increases from the bottom to the top of the image formed on the EPE 11).
- the MEMS control unit 8 controls the MEMS mirror 10 to reduce the interval between horizontal lines to be drawn with respect to the input image, in other words, to increase the line density.
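- as a rough software analogue of this timing-based pre-distortion (the real control is implemented in the laser driver ASIC 7 and the MEMS control unit 8; the function and parameter names below are illustrative):

```python
def emission_periods(base_period_ns, top_width_mm, bottom_width_mm, num_lines):
    """Per-scan-line pixel emission period for keystone pre-distortion.

    Each horizontal line keeps the same pixel count, but lines toward the
    bottom are drawn over a shorter width, so their pixel emission period is
    proportionally shorter (a simplified model of the control described for
    the laser driver ASIC 7; all values here are illustrative).
    """
    periods = []
    for line in range(num_lines):
        frac = line / (num_lines - 1)                        # 0 at top, 1 at bottom
        width = top_width_mm + frac * (bottom_width_mm - top_width_mm)
        periods.append(base_period_ns * width / top_width_mm)
    return periods

# Example: 75 mm-wide lines at the top shrinking to 70 mm at the bottom.
print(emission_periods(base_period_ns=20.0, top_width_mm=75.0,
                       bottom_width_mm=70.0, num_lines=5))
```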
- alternatively, the video ASIC 3 (see FIG. 2) in the laser projector 1 performs image processing on the image input from the image signal input unit 2 and draws the resulting image on the EPE 11. In this case, the video ASIC 3 performs image processing for transforming the original image input from the image signal input unit 2 into an image as shown in FIG. 14B.
- the laser driver ASIC 7 and the video ASIC 3 in the laser projector 1 correspond to an example of “correction means” in the present invention.
- in this way, when the display device 100 is installed in the vehicle interior, it becomes possible to adopt an appropriate arrangement that ensures head clearance while suppressing the occurrence of problems such as distortion of the image formed on the EPE 11.
- the third embodiment differs from the first and second embodiments described above in that, when the tilt angle α of the combiner 13 is changed, the tilt angle β of the EPE 11 is automatically changed according to the changed tilt angle α. Specifically, in the third embodiment, control is performed to change the tilt angle β of the EPE 11 while maintaining the relative angular relationship between the EPE 11 and the combiner 13 under a predetermined condition. In detail, control is performed to change the tilt angle β of the EPE 11 in accordance with the change in the tilt angle α of the combiner 13 so that the condition “β ≈ 2α” is satisfied.
- when the display device is installed and used in the passenger compartment, the driver's eye point varies depending on the driver's sitting height, seat position, and the like, so the driver tends to adjust the tilt angle α of the combiner 13. Therefore, in the third embodiment, on the assumption that the driver changes the tilt angle α of the combiner 13, the tilt angle β of the EPE 11 is automatically changed accordingly when the tilt angle α is changed.
- FIG. 15 is a block diagram schematically showing the overall configuration of the display apparatus 101 according to the third embodiment.
- the same components as those of the first embodiment are given the same reference numerals, and their description is omitted as appropriate.
- the display device 101 according to the third embodiment is different from the display device 100 according to the first embodiment in that it includes a laser projector 1a instead of the laser projector 1 and also includes an angle sensor 15 and an actuator 16.
- the angle sensor 15 detects the tilt angle ⁇ of the combiner 13.
- the actuator 16 performs control to change the tilt angle ⁇ of the EPE 11.
- the laser projector 1a includes the same components (see FIG. 2) as the laser projector 1, and also includes a control unit 1aa that controls the actuator 16.
- the control unit 1aa in the laser projector 1a acquires a detection signal S15 corresponding to the tilt angle α detected by the angle sensor 15, generates a control signal S16 for changing the tilt angle β of the EPE 11 based on the detection signal S15, and supplies it to the actuator 16. Specifically, the control unit 1aa performs control to set the tilt angle β of the EPE 11 to approximately twice the tilt angle α of the combiner 13. That is, when the tilt angle α of the combiner 13 is changed, the control unit 1aa performs control to change the tilt angle β of the EPE 11 to an angle approximately twice the changed tilt angle α.
- the controller 1aa and the actuator 16 correspond to an example of “angle changing means” in the present invention.
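- a minimal sketch of this closed-loop behavior (the callables standing in for the angle sensor 15, control unit 1aa, and actuator 16 interfaces are hypothetical; the patent does not specify them at this level):

```python
import time

TILT_RATIO = 2.0          # beta ~= 2 * alpha, per the third embodiment
DEADBAND_DEG = 0.2        # illustrative: ignore tiny sensor jitter

def follow_combiner_tilt(read_combiner_tilt_deg, set_epe_tilt_deg, period_s=0.05):
    """Keep the EPE tilt at roughly twice the combiner tilt.

    read_combiner_tilt_deg: callable returning the angle-sensor reading (alpha).
    set_epe_tilt_deg: callable commanding the EPE actuator to a target angle (beta).
    """
    last_target = None
    while True:
        alpha = read_combiner_tilt_deg()          # detection signal S15
        target_beta = TILT_RATIO * alpha          # control signal S16
        if last_target is None or abs(target_beta - last_target) > DEADBAND_DEG:
            set_epe_tilt_deg(target_beta)
            last_target = target_beta
        time.sleep(period_s)
```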
- FIG. 16 shows an example of a simulation result at the eye box center.
- the virtual image obtained from the simulation is shown by a solid line
- the shape of the reference screen is shown by a one-dot chain line.
- FIGS. 16A and 16B show examples of simulation results in the case where the tilt angle α of the combiner 13 is changed while the tilt angle β of the EPE 11 is maintained at its previous value.
- FIG. 16A shows an example of simulation results when the field lens 12 is not installed between the EPE 11 and the combiner 13. It can be seen from FIG. 16A that the virtual image is distorted; specifically, compared with the result in the initial state before changing the tilt angle α of the combiner 13 shown in FIG. 7, a new distortion of the virtual image has occurred.
- FIG. 16B shows an example of simulation results when the field lens 12 is installed between the EPE 11 and the combiner 13. From FIG. 16B, compared with FIG. 16A, the bow distortion of the virtual image is suppressed; however, compared with the result in the initial state before changing the tilt angle α of the combiner 13 shown in FIG. 7C, it can be seen that a new distortion of the virtual image has occurred.
- FIGS. 16C and 16D show examples of simulation results in the case where the tilt angle β of the EPE 11 is changed in accordance with the changed tilt angle α of the combiner 13.
- FIG. 16C shows a simulation result example when the field lens 12 is not installed between the EPE 11 and the combiner 13.
- FIG. 16C shows that the distortion of the virtual image is appropriately reduced as compared with FIG.
- FIG. 16D shows a simulation result example when the field lens 12 is installed between the EPE 11 and the combiner 13. From FIG. 16D, it can be seen that the distortion of the virtual image is appropriately removed as compared with FIG. 16B. Further, as compared with FIG. 16C, it can be seen that the bow distortion of the virtual image is appropriately removed.
- FIG. 17 shows an example of a simulation result with both eyes.
- the virtual image obtained from the simulation is indicated by a solid line and a broken line
- the shape of the reference screen is indicated by a one-dot chain line.
- a virtual image observed with the left eye is indicated by a solid line
- a virtual image observed with the right eye is indicated by a broken line.
- FIG. 17A shows an example of a simulation result when the field lens 12 is not installed between the EPE 11 and the combiner 13. It can be seen from FIG. 17A that the virtual image is distorted; specifically, compared with the result in the initial state before changing the tilt angle α of the combiner 13 shown in FIG. 9B, new virtual image distortion occurs, and the difference in the shape of the virtual image observed with each of the left and right eyes is large.
- FIG. 17B shows an example of simulation results when the field lens 12 is installed between the EPE 11 and the combiner 13. From FIG. 17B, compared with FIG. 17A, the bow distortion of the virtual image is suppressed; however, compared with the result in the initial state before changing the tilt angle α of the combiner 13 shown in FIG. 9C, it can be seen that a new distortion of the virtual image has occurred and that the difference in the shape of the virtual image observed with both eyes has increased.
- FIGS. 17C and 17D show examples of simulation results in the case where the tilt angle β of the EPE 11 is changed in accordance with the changed tilt angle α of the combiner 13.
- FIG. 17C shows a simulation result example when the field lens 12 is not installed between the EPE 11 and the combiner 13. From FIG. 17 (c), compared with FIG. 17 (a), the distortion of the virtual image is appropriately reduced, and the difference in the shape of the virtual image observed by each of both eyes is appropriately reduced. Recognize.
- FIG. 17D shows a simulation result example when the field lens 12 is installed between the EPE 11 and the combiner 13. From FIG. 17D, compared to FIG. 17B, the distortion of the virtual image is appropriately reduced, and the difference in the shape of the virtual image observed with both eyes is appropriately reduced. Recognize. Further, as compared with FIG. 17C, it can be seen that the bow distortion of the virtual image is appropriately removed.
- as described above, according to the third embodiment, when the tilt angle α of the combiner 13 is changed, the tilt angle β of the EPE 11 is appropriately changed (specifically, to approximately twice the changed tilt angle α), so that the distortion of the virtual image caused by the change in the tilt angle α of the combiner 13 can be appropriately suppressed.
- the tilt angle β of the EPE 11 does not have to be changed strictly to twice the changed tilt angle α; that is, the tilt angle β of the EPE 11 may be changed to an angle close to twice the changed tilt angle α of the combiner 13.
- the third embodiment can be implemented in combination with the second embodiment described above. That is, when the tilt angle α of the combiner 13 is changed, control can be performed to change the tilt angle β of the EPE 11 to approximately twice the changed tilt angle α, while processing for correcting the distortion of the image formed on the EPE 11 by the light emitted from the laser projector 1 is also performed.
- (Modification 1) In the above-described embodiments, the display devices 100 and 101 are configured so that the user looks up at the virtual image, whereas Modification 1 relates to a display device configured so that the user looks down at the virtual image.
- FIG. 18 is a block diagram schematically showing the overall configuration of the display device 102 according to the first modification.
- the same components as those of the first embodiment are given the same reference numerals, and their description is omitted as appropriate.
- the display device 102 according to Modification 1 differs from the display device 100 according to the first embodiment in that the virtual image is observed when the user looks down.
- the EPE 11 and the combiner 13 are tilted in the counterclockwise direction with respect to the axes b and c so that the virtual image can be observed when the user looks up.
- in Modification 1, the EPE 11 and the combiner 13 are tilted clockwise with respect to the axes b and c, at the tilt angles β and α respectively, so that the virtual image can be observed when the user looks down.
- also in Modification 1, the tilt angle β of the EPE 11 is set to approximately twice the tilt angle α of the combiner 13, as in the first embodiment (in Modification 1, the clockwise direction with respect to the axes b and c is defined as “positive”).
- Modification 2 is different from the above-described embodiment in that a mirror is provided between the EPE 11 and the combiner 13.
- FIG. 19 is a block diagram schematically showing the overall configuration of the display device 103 according to the second modification.
- the same components as those of the first embodiment are given the same reference numerals, and their description is omitted as appropriate.
- a plane mirror 19 is provided between the EPE 11 and the combiner 13.
- the plane mirror 19 is disposed between the field lens 12 and the combiner 13.
- the plane mirror 19 reflects the light from the field lens 12 and makes it incident on the combiner 13.
- the plane mirror 19 makes light that has passed through the center of the EPE 11 enter at an incident angle of 45 degrees, and reflects the light at a reflection angle of 45 degrees to enter the center of the combiner 13.
- Such a plane mirror 19 is provided from the viewpoint of shortening the overall length of the display device 103, for example.
- an axis corresponding to the traveling direction of light incident on the plane mirror 19 from the center C1 of the EPE 11 is defined as “a1”, and an axis corresponding to the traveling direction of light incident on the center C2 of the combiner 13 from the plane mirror 19 is defined as “a2”.
- the axis a1 and the axis a2 correspond to axes obtained by bending the axis a (see FIG. 3) connecting the center C1 of the EPE 11 and the center C2 of the combiner 13 described above.
- the angle at which the EPE 11 is inclined with respect to the axis b1, which is orthogonal to the axis a1 and passes through the center C1 of the EPE 11, is defined as the tilt angle β of the EPE 11.
- the angle at which the combiner 13 is inclined with respect to the axis c1, which is orthogonal to the axis a2 and passes through the center C2 of the combiner 13, is defined as the tilt angle α of the combiner 13.
- for the tilt angle β of the EPE 11, the counterclockwise direction with respect to the axis b1 is defined as “positive”, and for the tilt angle α of the combiner 13, the clockwise direction with respect to the axis c1 is defined as “positive”.
- when the tilt angles β and α are defined in this way, the tilt angles that can appropriately suppress the distortion of the virtual image satisfy “β ≈ 2α”, as in the first embodiment. Therefore, also in Modification 2, the tilt angle β of the EPE 11 is set to approximately twice the tilt angle α of the combiner 13.
- even when a mirror is provided in a different manner, the tilt angle β of the EPE 11 is set to approximately twice the tilt angle α of the combiner 13 (however, the directions in which the EPE 11 and the combiner 13 are tilted (clockwise or counterclockwise) change depending on the manner in which the mirror is provided).
- Modification 3 is different from the above-described embodiment in that a liquid crystal display is used instead of the laser projector 1 and the EPE 11.
- FIG. 20 is a block diagram schematically showing the overall configuration of the display device 104 according to the third modification.
- the same components as those of the first embodiment are given the same reference numerals, and their description is omitted as appropriate.
- the display device 104 according to the modification 3 is different from the display device 100 according to the first embodiment in that it includes a liquid crystal display 200 instead of the laser projector 1 and the EPE 11.
- the liquid crystal display 200 corresponds to an example of the “image forming element” of the present invention.
- the liquid crystal display 200 is also disposed at a tilt angle β that is approximately twice the tilt angle α of the combiner 13, as in the first embodiment.
- in another example, instead of the liquid crystal display 200, an organic EL display can be used in place of the laser projector 1 and the EPE 11.
- the organic EL display corresponds to an example of the “image forming element” of the present invention.
- in Modification 4, the configuration of the field lens arranged on the exit side of the EPE 11 differs from that of the above-described embodiments.
- specifically, in the field lens according to Modification 4, the surface on which light from the EPE 11 is incident (that is, the incident surface) is convex, and the surface opposite to the surface on which light from the EPE 11 is incident (that is, the exit surface) is also convex.
- that is, the field lens according to Modification 4 has a biconvex spherical shape.
- the field lens according to the modification 4 is arranged such that an axis connecting the center of the field lens and the center of the EPE 11 is parallel to an axis extending in the vertical direction of the EPE 11.
- the light emitted from the field lens enters the combiner 13.
- the field lens according to Modification 4 corresponds to an example of “virtual image distortion correction element” in the present invention.
- FIG. 21 shows an example of a simulation result when the field lens according to Modification 4 is used. The simulation was performed using the following parameters:
- Distance between the EPE 11 and the field lens entrance surface: 2 mm (the centers of the EPE 11 and the field lens are assumed to coincide)
- Distance Y1 between the center C1 of the EPE 11 and the center C2 of the combiner 13: 170 mm
- Distance Y2 between the center C2 of the combiner 13 and the eye point: 500 mm
- Radius of curvature of the combiner 13: 400 mm (the combiner 13 is spherical)
- Tilt angle α of the combiner 13: 12 degrees
- Tilt angle β of the EPE 11: 24 degrees
- Size of the intermediate image on the EPE 11: horizontal length 75 mm × vertical length 25 mm
- Radius of curvature on the entrance side of the field lens: 800 mm
- Radius of curvature on the exit side of the field lens: 170 mm
- Center thickness of the field lens: 8 mm
- Refractive index of the field lens: 1.526
- FIG. 21A shows an example of a simulation result at the eyebox center when the field lens according to Modification 4 is used, and FIG. 21B shows an example of a simulation result with both eyes when the field lens according to Modification 4 is used. From FIGS. 21A and 21B, it can be seen that virtual image distortion (such as bow distortion) can be appropriately suppressed even when a field lens having a biconvex spherical surface is used.
- the present invention can be used for a display device that visually recognizes an image as a virtual image, such as a head-up display.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
Abstract
Description
First, a first embodiment of the present invention will be described.
FIG. 1 is a block diagram schematically showing the overall configuration of a display device 100 according to the first embodiment. Here, an outline of each component of the display device 100 will be described.
Next, the tilt angle β of the EPE 11 and the tilt angle α of the combiner 13 will be described with reference to FIG. 3. The tilt angle β corresponds to the “first angle” in the present invention, and the tilt angle α corresponds to the “second angle” in the present invention.
Next, simulation results when the tilt angle β of the EPE 11 is set to approximately twice the tilt angle α of the combiner 13 will be described.
- Tilt angle α of the combiner 13: 12 degrees
- Distance between the EPE 11 and the planar surface of the field lens 12: 2 mm
- Distance Y1 between the center C1 of the EPE 11 and the center C2 of the combiner 13: 170 mm
- Distance Y2 between the center C2 of the combiner 13 and the eye point: 500 mm
- Size of the intermediate image formed on the EPE 11: horizontal length 75 mm × vertical length 25 mm (the intermediate image is a lattice image in which undistorted squares are arranged, as shown in FIG. 5)
- Radius of curvature of the concave surface of the combiner 13: 400 mm
- Radius of curvature of the convex surface of the field lens 12: 150 mm
- Center thickness of the field lens 12: 8 mm
- Refractive index of the field lens 12: 1.526
First, the simulation results at the eye box center will be described with reference to FIGS. 6 and 7. Here, as shown in FIG. 6, a case in which the virtual image is formed at a position Y3 = 1500 mm in front of the eye box center is illustrated. The eye box center is the point approximately midway between both eyes, and refers to an ideal observation point from which the virtual image can be observed almost directly from the front.
- Intermediate image size on the EPE 11: 75 mm horizontally × 25 mm vertically
- Distance from the eye to the virtual image: 1500 mm
- Tilt angle α of the combiner 13: 12 degrees
- Radius of curvature of the R400 combiner 13: 400 mm
- Distance between the R400 combiner 13 and the EPE 11: 170 mm
- Reference screen size used for the R400 combiner 13: 450 mm horizontally × 150 mm vertically
- Radius of curvature of the R500 combiner 13: 500 mm
- Distance between the R500 combiner 13 and the EPE 11: 200 mm
- Reference screen size used for the R500 combiner 13: 360 mm horizontally × 120 mm vertically
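As a rough plausibility check (added here for illustration; it is not part of the original description), treating each combiner as a simple concave mirror, ignoring the field lens 12 and the tilt, and taking the EPE-to-combiner distances above as the object distance, the paraxial mirror equation already places the virtual image roughly where the simulation does:

```latex
\frac{1}{v}+\frac{1}{u}=\frac{2}{R}
\;\Rightarrow\;
\text{R500: } v=\left(\frac{2}{500}-\frac{1}{200}\right)^{-1}\,\mathrm{mm}=-1000\,\mathrm{mm},
\qquad
\text{R400: } v=\left(\frac{2}{400}-\frac{1}{170}\right)^{-1}\,\mathrm{mm}\approx-1133\,\mathrm{mm}
```

Taking the combiner-to-eye distance as roughly Y2 = 500 mm from the earlier parameter set gives about 1500 mm for R500 and about 1630 mm for R400 from the eye to the virtual image, the same order as the stated 1500 mm; the residual difference for R400 is consistent with the neglected field lens and the tilted geometry that the full simulation accounts for.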
FIG. 11 shows an example of the simulation results for R400 and R500. The results shown here were obtained without the field lens 12. FIG. 11(a) shows the amount of virtual image distortion for R400 and R500 when the tilt angle of the EPE 11 is set to various values. The "amount of virtual image distortion" is defined as the absolute value of the angle between the vertical line of the frame indicated by the dash-dot line X1 in FIGS. 11(b) and 11(c) and the vertical line at the edge of the virtual image indicated by the solid arrow X2 in FIGS. 11(b) and 11(c). FIG. 11(b) shows the virtual image distortion observed at the eye box center when R400 is used and the tilt angle β of the EPE 11 is set to 0 degrees, and FIG. 11(c) shows the virtual image distortion observed at the eye box center when R400 is used and the tilt angle β of the EPE 11 is set to 32 degrees.
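The distortion amount defined above can be evaluated directly from the endpoints of the observed edge line X2. The helper below is a hypothetical illustration added here (the function name and units are assumptions), not the code used to produce FIG. 11:

```python
import math

def distortion_amount_deg(edge_top, edge_bottom):
    """Absolute angle, in degrees, between the vertical line of the
    reference frame (X1) and the vertical line at the virtual image
    edge (X2), given the (x, y) endpoints of the observed edge line."""
    dx = edge_top[0] - edge_bottom[0]
    dy = edge_top[1] - edge_bottom[1]
    # Angle of the edge measured from the vertical axis.
    return abs(math.degrees(math.atan2(dx, dy)))

# Example: an edge whose top end is shifted 5 mm sideways over a
# 150 mm height deviates from vertical by about 1.9 degrees.
print(distortion_amount_deg((5.0, 150.0), (0.0, 0.0)))
```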
Next, a second embodiment of the present invention will be described. The second embodiment differs from the first embodiment in that distortion of the image formed on the EPE 11 by the light emitted from the laser projector 1 is corrected. Specifically, in the second embodiment, when the components are arranged such that the light emitted from the laser projector 1 does not enter the EPE 11 perpendicularly, the distortion of the image formed on the EPE 11 that can result from this non-perpendicular incidence is corrected.
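One common way to realize such a correction in software, shown here only as an illustrative sketch under assumed geometry (the application does not disclose an implementation at this level), is to pre-warp the source image with the inverse of the keystone mapping caused by the oblique incidence, for example using OpenCV:

```python
import cv2
import numpy as np

def prewarp_for_oblique_incidence(image, keystone_px):
    """Pre-distort `image` so that, after an oblique projection that
    spreads the top edge outward by `keystone_px` pixels on each side,
    the picture formed on the EPE 11 appears rectangular again.

    `keystone_px` is a hypothetical calibration value derived from the
    projector-to-EPE geometry; it is not specified in the application."""
    h, w = image.shape[:2]
    # Corners of the undistorted source image.
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    # Where those corners would land on the EPE after oblique projection.
    dst = np.float32([[-keystone_px, 0], [w + keystone_px, 0], [w, h], [0, h]])
    # Applying the inverse mapping beforehand cancels the keystone distortion.
    inverse_map = cv2.getPerspectiveTransform(dst, src)
    return cv2.warpPerspective(image, inverse_map, (w, h))
```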
Next, a third embodiment of the present invention will be described. The third embodiment differs from the first and second embodiments described above in that, when the tilt angle α of the combiner 13 is changed, the tilt angle β of the EPE 11 is automatically changed in accordance with the changed tilt angle α. Specifically, in the third embodiment, control is performed to change the tilt angle β of the EPE 11 while maintaining the relative angular relationship between the EPE 11 and the combiner 13 under a predetermined condition. More specifically, the tilt angle β of the EPE 11 is changed in accordance with the change in the tilt angle α of the combiner 13 so that the condition "β ≈ 2α" is satisfied.
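A minimal control sketch of this behavior is given below; it assumes hypothetical driver interfaces for the angle sensor 15 and the actuator 16 (`read_alpha_deg`, `current_beta_deg`, `set_beta_deg`), none of which are specified in the application:

```python
import time

TARGET_RATIO = 2.0     # maintain beta ≈ 2 * alpha, per the condition above
TOLERANCE_DEG = 0.1    # assumed dead band to avoid needless actuation

def maintain_tilt_relation(angle_sensor, epe_actuator, period_s=0.05):
    """Periodically read the combiner tilt angle alpha and command the
    EPE 11 actuator so that the EPE tilt angle beta stays at roughly
    twice alpha."""
    while True:
        alpha = angle_sensor.read_alpha_deg()        # combiner tilt (degrees)
        beta_target = TARGET_RATIO * alpha           # desired EPE tilt
        if abs(epe_actuator.current_beta_deg() - beta_target) > TOLERANCE_DEG:
            epe_actuator.set_beta_deg(beta_target)
        time.sleep(period_s)
```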
Next, modifications of the embodiments described above will be described. The modifications presented below can be implemented in appropriate combination with the first to third embodiments described above, and can also be implemented in appropriate combination with one another.
In the embodiments described above, the display devices 100 and 101 are configured so that the user looks up at the virtual image; Modification 1 relates to a display device configured so that the user looks down at the virtual image.
Modification 2 differs from the embodiments described above in that a mirror is provided between the EPE 11 and the combiner 13.
Modification 3 differs from the embodiments described above in that a liquid crystal display is used instead of the laser projector 1 and the EPE 11.
Modification 4 differs from the embodiments described above in the configuration of the field lens arranged on the exit side of the EPE 11. Specifically, in the field lens according to Modification 4, the surface on which the light from the EPE 11 is incident (that is, the incident surface) is formed as a convex surface, and the surface opposite to the surface on which the light from the EPE 11 is incident (that is, the exit surface) is also formed as a convex surface. In other words, the field lens according to Modification 4 has a biconvex spherical shape. The field lens according to Modification 4 is arranged so that the axis connecting its center and the center of the EPE 11 is parallel to the axis extending in the vertical direction of the EPE 11. The light emitted from the field lens enters the combiner 13. The field lens according to Modification 4 also corresponds to an example of the "virtual image distortion correction element" of the present invention. FIG. 21 shows an example of the simulation results obtained when this field lens is used; the simulation was performed using the following parameters.
- Distance between the EPE 11 and the entrance surface of the field lens: 2 mm (the center of the EPE 11 and the center of the field lens coincide)
- Distance Y1 between the center C1 of the EPE 11 and the center C2 of the combiner 13: 170 mm
- Distance Y2 between the center C2 of the combiner 13 and the eye point: 500 mm
- Radius of curvature of the combiner 13: 400 mm (the combiner 13 is spherical)
- Tilt angle α of the combiner 13: 12 degrees
- Tilt angle β of the EPE 11: 24 degrees
- Intermediate image size on the EPE 11: 75 mm horizontally × 25 mm vertically
- Radius of curvature of the entrance side of the field lens: 800 mm
- Radius of curvature of the exit side of the field lens: 170 mm
- Center thickness of the field lens: 8 mm
- Refractive index of the field lens: 1.526
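For orientation only (an estimate added here, not stated in the application), the thin-lens lensmaker's equation applied to the listed radii, taking R1 = 800 mm on the entrance side and R2 = -170 mm on the exit side in the usual sign convention and n = 1.526, suggests a field lens focal length of roughly 270 mm:

```latex
\frac{1}{f}=(n-1)\left(\frac{1}{R_1}-\frac{1}{R_2}\right)
          =0.526\left(\frac{1}{800}+\frac{1}{170}\right)\,\mathrm{mm}^{-1}
\;\Rightarrow\; f\approx 267\,\mathrm{mm}
```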
FIG. 21(a) shows an example of the simulation result at the eye box center when the field lens according to Modification 4 is used, and FIG. 21(b) shows an example of the simulation result with both eyes when the field lens according to Modification 4 is used. FIGS. 21(a) and 21(b) show that virtual image distortion (such as bow-shaped distortion) can be appropriately suppressed even when a field lens having a biconvex spherical shape is used.
In the embodiments described above, an example in which the present invention is applied to the laser projector 1 has been shown; however, the present invention can also be applied to various projectors other than the laser projector 1, such as a liquid crystal projector.
In the embodiments described above, an example in which the present invention is applied to the field lens 12 and the combiner 13 having spherical shapes has been shown; however, the present invention can also be applied to a field lens and/or a combiner having an aspherical shape. Furthermore, in the embodiments described above, an example in which the present invention is applied to the EPE 11 having a flat shape has been shown, but the EPE 11 is not limited to a flat shape.
11 Exit pupil expander (EPE)
12 Field lens
13 Combiner
15 Angle sensor
16 Actuator
19 Flat mirror
100 Display device
Claims (18)
- A display device comprising: an image forming element that forms an image to be displayed; and an optical element that displays a virtual image by reflecting light emitted from the image forming element, wherein a first angle, which is the angle of the image forming element with respect to the traveling direction of light traveling from the center of the image forming element to the center of the optical element, is set in accordance with a second angle, which is the angle of the optical element with respect to the traveling direction of the light, so that distortion of the virtual image displayed by the optical element is reduced.
- The display device according to claim 1, further comprising angle changing means for changing the angle of the image forming element with respect to the traveling direction of the light so as to reduce distortion of the virtual image caused by a change in the angle of the optical element with respect to the traveling direction of the light.
- The display device according to claim 2, further comprising means for changing the angle of the optical element.
- The display device according to any one of claims 1 to 3, wherein the first angle is set to be larger than the second angle.
- The display device according to any one of claims 1 to 4, wherein the first angle is set to be approximately twice the second angle.
- The display device according to any one of claims 1 to 5, wherein the optical element has a concave shape facing the traveling direction of the light emitted from the image forming element, and the image forming element has a flat shape.
- The display device according to claim 2, wherein the angle changing means changes the angle of the image forming element with respect to the traveling direction of the light so as to reduce trapezoidal distortion of the virtual image and/or a difference in aspect ratio between the image formed on the image forming element and the virtual image.
- The display device according to claim 2, wherein the angle changing means changes the angle of the image forming element with respect to the traveling direction of the light while maintaining the relative angular relationship between the image forming element and the optical element under a predetermined condition.
- The display device according to claim 2, wherein the angle changing means changes the angle of the image forming element with respect to the traveling direction of the light to approximately twice the angle of the optical element with respect to the traveling direction of the light.
- The display device according to any one of claims 1 to 9, wherein a virtual image distortion correction element is disposed between the image forming element and the optical element.
- The display device according to claim 10, wherein the virtual image distortion correction element corrects bow-shaped distortion of the virtual image caused by the curvature of the optical element.
- The display device according to any one of claims 1 to 11, wherein the image forming element is an exit pupil expander that expands the exit pupil of light emitted from a light source.
- The display device according to claim 12, further comprising correction means for correcting distortion of the image formed on the image forming element by the light emitted from the light source.
- The display device according to claim 13, wherein the correction means corrects distortion of the image caused by the angle of the light emitted from the light source with respect to the image forming element.
- The display device according to any one of claims 1 to 11, wherein the image forming element is a liquid crystal display.
- The display device according to any one of claims 1 to 11, wherein the image forming element is an organic EL display.
- The display device according to any one of claims 1 to 16, wherein the first angle is the angle formed by the image forming element with respect to an axis that is orthogonal to the traveling direction of light traveling from the center of the image forming element to the center of the optical element and that passes through the center of the image forming element, and the second angle is the angle formed by the optical element with respect to an axis that is orthogonal to the traveling direction of light traveling from the center of the image forming element to the center of the optical element and that passes through the center of the optical element.
- A display device comprising: an image forming element that forms an image to be displayed; an optical element that displays a virtual image by reflecting light emitted from the image forming element; first angle changing means for changing the angle of the optical element with respect to the traveling direction of light traveling from the center of the image forming element to the center of the optical element; and second angle changing means for changing the angle of the image forming element with respect to the traveling direction of the light so as to reduce distortion of the virtual image caused by a change in the angle of the optical element with respect to the traveling direction of the light.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2011/065187 WO2013005278A1 (ja) | 2011-07-01 | 2011-07-01 | Display device |
US14/129,358 US20140145913A1 (en) | 2011-07-01 | 2011-07-01 | Display device |
EP20110869139 EP2728394A4 (en) | 2011-07-01 | 2011-07-01 | DISPLAY DEVICE |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2011/065187 WO2013005278A1 (ja) | 2011-07-01 | 2011-07-01 | Display device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013005278A1 true WO2013005278A1 (ja) | 2013-01-10 |
Family
ID=47436652
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/065187 WO2013005278A1 (ja) | 2011-07-01 | 2011-07-01 | 表示装置 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20140145913A1 (ja) |
EP (1) | EP2728394A4 (ja) |
WO (1) | WO2013005278A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015197496A (ja) * | 2014-03-31 | 2015-11-09 | Aisin AW Co., Ltd. | Virtual image display device |
JP2015197495A (ja) * | 2014-03-31 | 2015-11-09 | Aisin AW Co., Ltd. | Virtual image display device |
JP2015232702A (ja) * | 2014-05-14 | 2015-12-24 | Denso Corporation | Head-up display |
WO2016185992A1 (ja) * | 2015-05-15 | 2016-11-24 | Sharp Corporation | Display device |
EP3096178A4 (en) * | 2014-03-27 | 2017-02-01 | Panasonic Intellectual Property Management Co., Ltd. | Display apparatus |
JPWO2018225408A1 (ja) * | 2017-06-08 | 2019-11-07 | NTT Docomo, Inc. | Glasses-type image display device |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018150922A1 (ja) * | 2017-02-15 | 2018-08-23 | Maxell, Ltd. | Head-up display device |
JP6830182B2 (ja) * | 2017-03-08 | 2021-02-17 | Panasonic Intellectual Property Management Co., Ltd. | Image projection device |
JP6715484B2 (ja) * | 2018-03-29 | 2020-07-01 | Panasonic Intellectual Property Management Co., Ltd. | Image display system, image display method, image display program, and moving body |
US10609364B2 (en) * | 2018-04-06 | 2020-03-31 | Facebook Technologies, Llc | Pupil swim corrected lens for head mounted display |
US11143861B2 (en) * | 2018-07-17 | 2021-10-12 | Google Llc | Systems, devices, and methods for laser projection in wearable heads-up displays |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1048562A (ja) * | 1996-07-30 | 1998-02-20 | Asahi Glass Co Ltd | Holographic display device |
JPH11249061A (ja) * | 1998-02-27 | 1999-09-17 | Asahi Glass Co Ltd | Information display device |
JPH11337862A (ja) * | 1998-05-28 | 1999-12-10 | Mitsubishi Electric Corp | Head-up display |
JP2000347127A (ja) * | 1999-06-04 | 2000-12-15 | Nippon Soken Inc | Head-up display device for vehicle |
JP2010217372A (ja) * | 2009-03-16 | 2010-09-30 | Funai Electric Co Ltd | Laser projector |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5220363A (en) * | 1988-09-14 | 1993-06-15 | Casio Computer Co., Ltd. | Projector |
TWI221749B (en) * | 2000-05-23 | 2004-10-01 | Nagase & Co Ltd | Organic EL display and method for manufacturing organic EL display |
JP5050862B2 (ja) * | 2008-01-09 | 2012-10-17 | Denso Corporation | Image forming apparatus |
JP2009246505A (ja) * | 2008-03-28 | 2009-10-22 | Toshiba Corp | Image display device and image display device |
2011
- 2011-07-01 EP EP20110869139 patent/EP2728394A4/en not_active Withdrawn
- 2011-07-01 WO PCT/JP2011/065187 patent/WO2013005278A1/ja active Application Filing
- 2011-07-01 US US14/129,358 patent/US20140145913A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20140145913A1 (en) | 2014-05-29 |
EP2728394A4 (en) | 2015-01-21 |
EP2728394A1 (en) | 2014-05-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013005278A1 (ja) | Display device | |
JP5214060B1 (ja) | Virtual image display device | |
JP6551730B2 (ja) | Image display device and moving body | |
JP6478151B2 (ja) | Image display device and object apparatus | |
JP4769912B1 (ja) | Optical element, head-up display, and method of manufacturing optical element | |
US10613321B2 (en) | Image display apparatus | |
JP6617947B2 (ja) | Image display device and image display system | |
JP6908898B2 (ja) | Image display device | |
JP2017016006A (ja) | Optical scanning device and image display device | |
EP3006988B1 (en) | Image display apparatus | |
WO2013051086A1 (ja) | Head-up display | |
JP2016130759A (ja) | Image display device | |
JP6611310B2 (ja) | Projection display device for vehicle | |
JP2017083631A (ja) | Display device, control method, program, and storage medium | |
JP2009192561A (ja) | Image display device | |
JP2019158991A (ja) | Display device, display system, and moving body | |
JPWO2013005278A1 (ja) | Display device | |
JP6737370B2 (ja) | Projection device | |
JP5666003B2 (ja) | Light source unit and method of manufacturing light source unit | |
WO2013179494A1 (ja) | Projection device, head-up display, control method, program, and storage medium | |
JP7017083B2 (ja) | Image display device and moving body device | |
WO2013145153A1 (ja) | Image drawing device | |
JP2016053680A (ja) | Image display and reproduction device and exit pupil expansion method | |
JP2015219389A (ja) | Virtual image display device and image forming element | |
JP2012247603A (ja) | Light beam scanning device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11869139 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013522617 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011869139 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14129358 Country of ref document: US |