USRE43490E1 - Wide-angle dewarping method and apparatus - Google Patents
- Publication number
- USRE43490E1 (US application 12/118,570)
- Authority
- US
- United States
- Prior art keywords
- image
- control points
- perspective corrected
- distorted image
- vectors
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/04—Context-preserving transformations, e.g. by using an importance map
- G06T3/047—Fisheye or wide-angle transformations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
Definitions
- the present invention relates to a method and apparatus for displaying a perspective corrected field of view from wide angle video sources, and more particularly relates to permitting the user of an orientation sensing means to view a selected portion of stored or real time video encoded from a wide angle source and transforming that portion to a perspective-corrected field of view.
- Virtual reality (VR): a user is permitted to view a computer-generated graphical representation of a selected environment. Depending on the sophistication of the hardware and software used to generate the virtual reality environment, the user may be treated to a three dimensional view of the simulated environment.
- Telepresence: a user is permitted to view a real-world, live or recorded environment from a three dimensional perspective.
- the user is permitted to see different portions of the VR and telepresence environments simply by moving or orienting his head in one or more degrees of freedom. This permits the user to obtain the sensation that he is immersed in the computer-generated/real-world environment.
- High end devices detect pan, roll and tilt motions by the user and cause the environment to change accordingly.
- the pan/tilt/roll motions may be inputted by many types of input devices, such as joysticks, buttons or head orientation sensors (which may be connected to head mounted displays).
- a continuing problem with the prior art is how to encode sufficient data that a viewer may arbitrarily move his viewing perspective within the telepresence environment and not look beyond the field of view.
- One relatively simple solution, where the telepresence environment is based on a real three dimensional environment, is to simply use the head orientation sensors to cause a camera to track the orientation of the viewer. This has obvious limitations in that only one viewer can be in the telepresence environment at a time (since the camera can only track one viewer, and the other viewers will not typically be able to follow the head motions of the controlling viewer) and, also, prerecorded data cannot be used. Further, there is an inherent delay between a change in user viewing perspective and the time that it takes to realign the corresponding camera. These limitations greatly restrict the value of such systems.
- One method for overcoming each of these limitations is to encode, either in real time or by pre-recording, a field of view largely equivalent to the entire range of motion vision of a viewer—that is, what the viewer would see if he moved his head in each permitted direction throughout the entire permissible range. For example, encoding substantially a full hemisphere of visual information would permit a plurality of viewers a reasonable degree of freedom to interactively look in a range of directions within the telepresence environment.
- a typical approach for encoding substantially a full hemisphere of information involves using a fish-eye lens.
- Fish-eye lenses by their nature, convert a three dimensional scene to a two-dimensional representation by compressing the data at the periphery of the field of view.
- For the information to be viewed comfortably by a viewer in the VR environment, the visual data must be decompressed, or dewarped, so that it is presented in normal perspective as a two dimensional representation.
- the present invention overcomes the limitations of the prior art.
- the present invention transforms a plurality of viewing vectors within a selected portion of the wide angle, three dimensional video input into two dimensional control points and uses a comparatively simple method to transform the image between the control points to create a perspective-corrected field of view.
- the present invention is drawn to a method and apparatus which provides perspective corrected views of live, prerecorded or simulated wide angle environments.
- the present invention first captures a wide angle digital video input by any suitable means, such as through the combination of a high resolution video camera, hemispherical fisheye lens and real time digital image capture board.
- the captured image is then stored in a suitable memory means so portions of the image may be selected at a later time.
- When a portion of the stored video is selected, a plurality of discrete viewing vectors in three dimensional space are chosen and transformed into a plurality of control points in a corresponding two dimensional plane.
- the area between the control points, which is still warped from the original wide angle image capture, is then transformed into a perspective corrected field of view through a biquadratic polynomial mapping technique.
- the perspective corrected field of view is then displayed on a suitable displaying apparatus, such as a monitor or head mounted display.
- the present invention further has the ability to sense an inputted selection, orientation and magnification of a new portion of the stored video for transformation.
- the present invention provides a dependable, low cost, faster and more elegantly simple solution to dewarping wide angle three dimensional images.
- the present invention also allows for simultaneous dynamic transformation of wide angle video to multiple viewers and provides each user with the ability to access and manipulate the same or different portions of the video input.
- the present invention also allows the computer generated three dimensional polygons to be rendered in advance; thus, users may view the environments from any orientation quickly and without expensive rendering hardware.
- FIG. 1 shows a functional block diagram of one embodiment of the present invention.
- FIG. 2 diagrams the geometry between three dimensional (X-Y-Z) space and its corresponding two dimensional (U-V) plane.
- FIG. 3a shows a bilinear mapping of a warped image.
- FIG. 3b shows a biquadratic mapping of a warped image.
- FIG. 4 shows a side view of a viewing vector from a three dimensional (X-Y-Z) wide angle lens as it is seen on a two dimensional (U-V) plane.
- FIG. 5 shows a three dimensional field of view along with a plurality of viewing vectors according to the present invention.
- FIG. 6 shows a block diagram of the elements of a forward texture mapping ASIC according to the present invention.
- FIG. 7 shows an example of a U-V source texture transformed into a X-Y plane destination texture according to the present invention.
- FIG. 8 is one embodiment of how to obtain a 360 degree view using six hemispherical fisheye lenses according to the present invention.
- FIG. 9 is a functional flow chart of one embodiment of the present invention.
- a high resolution video camera 10 having a wide angle lens 20 such as a hemispherical fisheye lens, is directed to a real world scene 22 .
- the output 24 of the camera 10 is provided to a real time image digitizing board 30 , commonly referred to as a “frame grabber,” located in or operatively connected to a conventional high speed computer indicated generally at 150 .
- the camera 10 may be any camera which is capable of using a wide angle lens and providing suitable resolution. In most instances the camera will be a video camera, although in some instances it may be desirable to use a still frame camera.
- the computer 150 is any computer capable of receiving and processing video information at an acceptable rate and may, for example, be an 80486-based or Pentium™-based system, or other computer platform such as are made by Silicon Graphics, Sun Microsystems, Apple Computer, or other similar computer manufacturers.
- the fisheye lens 20 causes the video output signal 24 from the camera 10 to be optically warped in a non-linear manner. Before the image can be comfortably viewed by a user, perspective-correcting measures must be taken.
- the digitized video signal 24 is thus transferred through the digitizing board 30 (typically but not necessarily operating at 30 frames per second) into memory 40 of the computer 150 so that portions of the video picture can be randomly accessed by a microprocessor 50 , also within the computer 150 , at any time.
- the dewarping software is also stored in memory 40 and is applied to the video signal 24 by the microprocessor 50 .
- the stored video signal is then transmitted from memory 40 to a special purpose ASIC 60 capable of biquadratic or higher order polynomial transformations for texture warping and interpolation.
- the texture warping ASIC 60 may be omitted and its functionality may be performed by software. Phantom lines have been used to show the optional nature of ASIC 60 .
- the perspective corrected video signal is next transmitted to a video output stage 70 , such as a standard VGA card, and from there displayed on a suitable monitor, head mounted display or the like 80 .
- An input device 90 such as a joystick or headtracker (which senses the head movements of a user wearing a headmounted display), transmits position information through a suitable input port 100 , such as a standard serial, parallel or game port, to the microprocessor 50 to control the portion of the stored video that is selected, dewarped and displayed.
- the input device 90 also transmits roll/pitch/yaw information to the microprocessor 50 so that a user may control the orientation of the dewarped video signal.
- a magnification option could be added to the input device 90 to allow the user to magnify the selected portion of video input, constrained only by the resolution of the camera 10 .
- FIG. 2 shows a real world three dimensional environment 200 which has been imaged by the wide angle lens 20 .
- This environment is defined by the Cartesian coordinate system in X, Y and Z with the viewpoint defined to be the origin of the coordinate system.
- the viewing direction of the user as defined by the input device 90 , is given as a viewing vector in the X-Y-Z coordinate system.
- the image plane 210 containing the warped wide angle image is defined by a two dimensional coordinate system in U and V, with the origin of the coordinate system coincident with the origin of the X-Y-Z coordinate system. If the field of view of the lens 20 is sufficient, and the lens is rotationally symmetric about the viewing axis, the digitized warped image will be roughly circular in the U-V plane.
- the first generation of ASICs, developed for low-cost texture mapping of three dimensional graphics, mapped video images through a bilinear technique, such as is shown in FIG. 3(a).
- These chips were able to apply linear interpolation to texture pixels in both the X and Y directions and could thereby stretch rectangular source textures to any two dimensional quadrilateral shape.
- An example of a chip of this type is the Artist Graphics 3GA chip.
- These bilinear chips do, however, introduce texture errors for polygons whose vertices have been subject to significant amounts of perspective, and further are not capable of sufficiently high order texture distortion to adequately flatten extreme wide angle views, such as with hemispherical fisheye lenses.
- FIG. 3b shows an example of a biquadratic technique, such as is now coming onto the market.
- the preferred embodiment of the present invention uses an ASIC chip which implements a texture warping technique of at least second polynomial order.
- the present invention is of sufficient simplicity that this technique could also be implemented in software on a general purpose high speed computer, such as a Silicon Graphics Indigo™ computer or a Pentium™-based computer.
- the warped image in the U-V plane, shown in FIG. 2 has a radius 220 equal to RADIUS pixels with an origin at UORIGIN and VORIGIN.
- Equations (1) convert an inputted X-Y-Z three dimensional viewing vector into a corresponding control point in the U-V plane.
- a biquadratic polynomial transformation (N=2 in the above equations) has been selected because a second order polynomial approximates the warping characteristics of most lenses to an adequately high degree of precision.
- a pseudo-inverse technique is used.
- the values for aij and bij are then found by mapping the points in the U-V plane for the 3×3 grid of control points using the above equations (6).
- the biquadratic polynomial transformations of the equations (3) are then used to transform the area between the control points.
- the determination of the coordinates of each pixel in the U-V plane takes a total of thirteen multiplication and ten addition operations.
- three of the required multiplication operations per pixel may be obviated by storing a table of xy, x² and y² values for each xy coordinate pair in the dewarped destination image.
- the “x” values which do not vary as “y” changes (i.e. a1*x + a4*x² and b1*x + b4*x²) may also be precomputed and stored.
- the “y” values which do not vary as “x” changes may be precomputed and stored.
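The two-multiply, four-add inner loop described in these bullets can be sketched in Python. This assumes a single-index coefficient layout u(x, y) = a0 + a1*x + a2*y + a3*xy + a4*x² + a5*y², which is consistent with the a1*x + a4*x² terms quoted above; the function names are illustrative, not the patent's.

```python
import numpy as np

def precompute_terms(a, width, height):
    """Precompute the per-column and per-row partial sums of
    u(x, y) = a0 + a1*x + a2*y + a3*x*y + a4*x^2 + a5*y^2."""
    x = np.arange(width, dtype=np.float64)
    y = np.arange(height, dtype=np.float64)
    col = a[0] + a[1] * x + a[4] * x * x   # "x" terms: constant as y changes
    row = a[2] * y + a[5] * y * y          # "y" terms: constant as x changes
    return col, row

def eval_poly(a, col, row, xy):
    """Per output pixel: one multiply (a3 * xy) and two adds."""
    return col[None, :] + row[:, None] + a[3] * xy

# Sanity check against direct evaluation on a small grid.
width, height = 4, 3
a = np.array([1.0, 0.5, -0.25, 0.125, 0.01, -0.02])
xs, ys = np.meshgrid(np.arange(width, dtype=float), np.arange(height, dtype=float))
xy = xs * ys                               # the stored xy table from the text
col, row = precompute_terms(a, width, height)
u_fast = eval_poly(a, col, row, xy)
u_direct = a[0] + a[1]*xs + a[2]*ys + a[3]*xs*ys + a[4]*xs**2 + a[5]*ys**2
assert np.allclose(u_fast, u_direct)
```

The same xy, x² and y² tables serve both u and v, since those products are shared between the a and b polynomials.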
- the accuracy of the dewarping transformation will increase as the number of transformed viewing vectors increases, i.e. a 4 ⁇ 4 grid of control points will produce a more accurate transformation than a 3 ⁇ 3 grid of control points.
- the amount of increase in accuracy quickly approaches an asymptote as the number of control points is increased.
- One skilled in the art will recognize, therefore, that there is little reason to increase the number of viewing vectors to more than half of the total number of pixels in the displayed region.
- FIG. 9 shows a functional flow chart of the major elements of one embodiment of the present invention.
- the fixed warped image parameters are defined, such as the size of the input image, the input image radius, and the input image center in U-V coordinates typically measured in pixels.
- the next step 410 is to initialize the variable dewarped image parameters, such as the size of the dewarped image area, the horizontal and vertical fields of view (generally shown in degrees), the creation of an untransformed view cone centered in this embodiment on the +Z axis, and the initialization of the layout and number of control points used therewith.
- the next step is to load the precomputed inner-loop matrix values as well as the “xy” product terms, as shown in step 420 , to ensure that the transformation is accomplished as quickly and efficiently as possible.
- the video signal is input to the system in any suitable form, i.e. live or pre-recorded real-time digitized video or computer synthesized video environments.
- the system then allows the user to select the viewing vector (step 440 ) which in turn determines the portion of video which is to be transformed.
- the control points are next transformed from the selected viewing vectors (step 450 ) and the region defined by the control points is dewarped (step 460 ).
- the signal is then sent to the video buffer and to an appropriate viewing apparatus (step 470 ).
- the loop also allows the user to make on-the-fly selections of alternate portions of the incoming video.
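The steps above reduce to a simple control-flow skeleton. The patent specifies no code, so everything here is a stand-in: the callables model the hardware and user-input stages.

```python
def dewarp_loop(get_frame, get_view, transform, display, n_frames):
    """Skeleton of FIG. 9: steps 400-420 set up parameters and tables,
    then steps 430-470 repeat for each frame and view selection."""
    state = {"tables_loaded": True}        # steps 400-420: params + precomputed tables
    shown = []
    for _ in range(n_frames):
        frame = get_frame()                # step 430: live/recorded/synthetic input
        view = get_view()                  # step 440: on-the-fly view selection
        corrected = transform(frame, view, state)  # steps 450-460: control points + dewarp
        shown.append(display(corrected))   # step 470: video buffer / viewing apparatus
    return shown
```

Running the loop per frame is what lets the user reselect alternate portions of the incoming video on the fly, as the text notes.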
- the same control points for the U-V plane map to the corners of the display screen in the X-Y plane.
- the warped regions outside the bounding box may be clipped by hardware or software so that they are not visible on the display screen.
- the source pixel coordinates which are fed from the host CPU, are converted to aij and bij coordinates for forward mapping in the forward mapping solution stage 240, again using techniques mathematically equivalent to those of the equations (7).
- a series of instructions is further sent from the host CPU to the chip 230 and received by a control unit 260 .
- the control unit 260 sequences and controls the operation of the other functional stages within the chip 230 .
- the host CPU also directs a linear sequence of source pixels, which are to be warped, to an interpolation sampler stage 250 within chip 230 .
- these can be subject to a low-pass spatial prefiltering stage 270 prior to transmission to the chip, to reduce sampling error during the warping process.
- the source pixels and the a ij and b ij coordinates are both fed to the interpolation sampler 250 .
- For each input pixel, the interpolation sampler 250 produces one or more destination pixels together with their corresponding X-Y destination coordinates. These warped pixels are then fed into the video frame buffer 280, located outside of the ASIC chip 230.
- anti-aliasing circuitry 290 within the chip performs interpolation on output pixel values, such as bilinear interpolation between adjacent pixel samples, to minimize the effects of output spatial quantization error.
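Bilinear interpolation between adjacent pixel samples, as performed by the anti-aliasing circuitry 290, can be sketched as follows (a plain-Python illustration, not the ASIC's implementation):

```python
import math

def bilerp(img, u, v):
    """Sample img (a 2-D list, indexed [row][col]) at fractional (u, v)
    by linear interpolation between the four surrounding pixels."""
    u0, v0 = math.floor(u), math.floor(v)
    fu, fv = u - u0, v - v0
    h, w = len(img), len(img[0])
    u0, v0 = max(min(u0, w - 1), 0), max(min(v0, h - 1), 0)
    u1, v1 = min(u0 + 1, w - 1), min(v0 + 1, h - 1)
    top = img[v0][u0] * (1 - fu) + img[v0][u1] * fu
    bot = img[v1][u0] * (1 - fu) + img[v1][u1] * fu
    return top * (1 - fv) + bot * fv
```

Clamping at the borders stands in for the hardware clipping of out-of-bounds regions described earlier.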
- the techniques described herein may also be applied to synthetic images. Such images may be created entirely within a computer environment and may be composed of three dimensional geometrical descriptions of objects which can be produced by computer graphics rendering techniques generally known to those skilled in the art.
- synthetic images are produced by linear perspective projection, emulating the physical process of imaging onto planar film with a lens having a narrow field of view and producing a view of the synthetic environment as seen through a cone or truncated three dimensional pyramid.
- the color, intensity shading and other simulated physical properties of each pixel on the planar image grid can also be readily determined.
- the viewing vectors in X-Y-Z space are rewritten in terms of the warped control point coordinates in the U-V plane
- a direction vector in X-Y-Z space can thus be generated for each pixel in the U-V plane in the synthetic wide angle image which is created.
- the generated vectors point in all directions within the created hemisphere, spaced to the limits of the resolution of the U-V image.
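Per-pixel direction vectors for such a synthetic image can be generated by inverting the ideal lens model f(θ) = (RADIUS)(sin(θ)) given in the Description. The sketch below assumes that model (the u_origin/v_origin parameters correspond to the text's UORIGIN and VORIGIN) and returns None outside the circular image:

```python
import math

def pixel_to_direction(u, v, radius, u_origin=0.0, v_origin=0.0):
    """Unit direction vector in X-Y-Z space for a U-V pixel of an ideal
    hemispherical fisheye image (f(theta) = radius * sin(theta))."""
    x = (u - u_origin) / radius
    y = (v - v_origin) / radius
    r2 = x * x + y * y
    if r2 > 1.0:
        return None                        # pixel lies outside the circular image
    return (x, y, math.sqrt(1.0 - r2))     # +Z hemisphere, unit length
```

Sweeping u and v over the image grid yields vectors pointing in all directions within the hemisphere, spaced to the limits of the U-V resolution, as the text describes.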
- This technique can be used for the production of three dimensional modeled cartoons or interactive home gaming applications, among others.
- the image substrate for recordation may be an electronic two dimensional image sensor, such as a CCD chip, or photographic film capable of chemically recording the image for subsequent transfer into digital form.
- the present invention is not limited to transforming wide angle video onto a planar (U-V) surface, but that it is within the scope of the invention to transform wide angle video onto any suitable surface for displaying the video for the user.
- two real world, wide angle lenses can be positioned opposite each other to permit near 360 degrees of total coverage of the environment. If seamless omnidirectional coverage of an environment is required, this could be achieved with six wide angle lenses positioned along each direction of a three dimensional axis, as shown in FIG. 8 .
- This arrangement can be coupled with a video switching mechanism for choosing which signal is to be dewarped for the selected view and orientation of the video input.
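Choosing which of the six signals of FIG. 8 to dewarp for a given viewing direction can be done by picking the axis with the largest absolute component, as in cube-map addressing. The face labels below are our convention, not the patent's:

```python
def select_lens(vx, vy, vz):
    """Return the axis-aligned lens (of six) whose hemisphere best
    covers the viewing direction (vx, vy, vz)."""
    ax, ay, az = abs(vx), abs(vy), abs(vz)
    if ax >= ay and ax >= az:
        return "+X" if vx >= 0 else "-X"   # dominant X component
    if ay >= az:
        return "+Y" if vy >= 0 else "-Y"   # dominant Y component
    return "+Z" if vz >= 0 else "-Z"       # dominant Z component
```

The returned label would drive the video switching mechanism, after which the selected signal is dewarped as before.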
- the same video signal may be simultaneously transmitted to an arbitrarily large number of viewers all having the ability to simultaneously dewarp the same or different portions of the video input, as in the case of interactive cable TV viewing or multiple player online interactive video game playing.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
Description
In the case of an ideal hemispheric fisheye lens, f(θ)=(RADIUS)(sin(θ)) and the lens equation which results is:
Equations (1) convert an inputted X-Y-Z three dimensional viewing vector into a corresponding control point in the U-V plane.
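Under the ideal hemispheric model f(θ) = (RADIUS)(sin(θ)) just given, one consistent form of equations (1) follows from sin(θ) = √(x²+y²)/ρ, with ρ the vector length, and the in-plane direction (x, y)/√(x²+y²); the two factors collapse to x/ρ and y/ρ. The sketch below is our reconstruction under that assumption, not the patent's verbatim equations (u_origin/v_origin correspond to UORIGIN and VORIGIN):

```python
import math

def vector_to_control_point(x, y, z, radius, u_origin=0.0, v_origin=0.0):
    """Map a viewing vector (x, y, z), z > 0, to a U-V control point,
    assuming the ideal hemispheric lens f(theta) = radius * sin(theta)."""
    rho = math.sqrt(x * x + y * y + z * z)   # length of the viewing vector
    u = u_origin + radius * x / rho
    v = v_origin + radius * y / rho
    return u, v
```

Applying this to each vector of the 3×3 viewing grid yields the nine warped control points used in the fit below.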
is then found to describe the geometric correction necessary to transform the region within the warped 3×3 grid in the U-V plane into a perspective corrected field of view. A biquadratic polynomial transformation, N=2 in the above equations, has been selected because a second order polynomial approximates the warping characteristics of most lenses to an adequately high degree of precision and because there is existing hardware to perform the resulting biquadratic transformation. However, it will be appreciated by one skilled in the art that other polynomial transformations of higher degree could be used to increase the precision of the transformation.
The values for v and bij can be similarly found. In matrix form, the expanded equations (4) can be written as:
U=WA
V=WB (5)
To discover aij and bij according to the method of the present invention, a pseudo-inverse technique is used. However, one skilled in the art will appreciate that there are methods to solve equations (5) other than by a pseudo-inverse technique, e.g. a least squares technique. The pseudo-inverse solutions for A and B in the above equation (5) are:
A = (WᵀW)⁻¹WᵀU
B = (WᵀW)⁻¹WᵀV (6)
Therefore, for a target display of a given pixel resolution N×M, W and its pseudo-inverse (WᵀW)⁻¹Wᵀ can be calculated a priori. The values for aij and bij are then found by mapping the points in the U-V plane for the 3×3 grid of control points using the above equations (6). The biquadratic polynomial transformations of the equations (3) are then used to transform the area between the control points. In this embodiment, the determination of the coordinates of each pixel in the U-V plane takes a total of thirteen multiplication and ten addition operations. Additionally, three of the required multiplication operations per pixel may be obviated by storing a table of xy, x² and y² values for each xy coordinate pair in the dewarped destination image. In another embodiment, the “x” values which do not vary as “y” changes (i.e. a1*x + a4*x² and b1*x + b4*x²) may also be precomputed and stored. Likewise, the “y” values which do not vary as “x” changes may be precomputed and stored. These further optimizations reduce the operations needed to determine the coordinates of each pixel in the U-V plane to two multiplication and four addition operations.
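Equations (5) and (6) translate directly into NumPy. The monomial ordering [1, x, y, xy, x², y²] is an assumption consistent with the a1*x + a4*x² terms above; np.linalg.lstsq would serve equally, per the text's least-squares remark:

```python
import numpy as np

def fit_biquadratic(xy_dst, uv_src):
    """Solve U = W A and V = W B (equations (5)) for the biquadratic
    coefficients via the pseudo-inverse A = (W^T W)^-1 W^T U (equations (6)).
    xy_dst: (9, 2) destination grid points; uv_src: (9, 2) warped control points."""
    x, y = xy_dst[:, 0], xy_dst[:, 1]
    W = np.stack([np.ones_like(x), x, y, x * y, x * x, y * y], axis=1)
    pinv = np.linalg.inv(W.T @ W) @ W.T    # (W^T W)^-1 W^T, computable a priori
    return pinv @ uv_src[:, 0], pinv @ uv_src[:, 1]
```

For a fixed 3×3 grid, `pinv` depends only on the destination resolution, so it can be cached across frames, exactly as the text observes.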
Claims (42)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/118,570 USRE43490E1 (en) | 1994-05-27 | 2008-05-09 | Wide-angle dewarping method and apparatus |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/250,594 US5796426A (en) | 1994-05-27 | 1994-05-27 | Wide-angle image dewarping method and apparatus |
US09/128,963 US6005611A (en) | 1994-05-27 | 1998-08-04 | Wide-angle image dewarping method and apparatus |
US09/429,697 US6346967B1 (en) | 1994-05-27 | 1999-10-28 | Method apparatus and computer program products for performing perspective corrections to a distorted image |
US10/015,075 US7042497B2 (en) | 1994-05-27 | 2001-12-10 | Wide-angle dewarping method and apparatus |
US12/118,570 USRE43490E1 (en) | 1994-05-27 | 2008-05-09 | Wide-angle dewarping method and apparatus |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/015,075 Reissue US7042497B2 (en) | 1994-05-27 | 2001-12-10 | Wide-angle dewarping method and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
USRE43490E1 true USRE43490E1 (en) | 2012-06-26 |
Family
ID=26827117
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/118,570 Expired - Fee Related USRE43490E1 (en) | 1994-05-27 | 2008-05-09 | Wide-angle dewarping method and apparatus |
Country Status (1)
Country | Link |
---|---|
US (1) | USRE43490E1 (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100033551A1 (en) * | 2008-08-08 | 2010-02-11 | Adobe Systems Incorporated | Content-Aware Wide-Angle Images |
US20140314336A1 (en) * | 2011-12-19 | 2014-10-23 | Dai Nippon Printing Co., Ltd. | Image processing device, image processing method, program for image processing device, recording medium, and image display device |
US20150160539A1 (en) * | 2013-12-09 | 2015-06-11 | Geo Semiconductor Inc. | System and method for automated test-pattern-free projection calibration |
WO2017075501A1 (en) * | 2015-10-30 | 2017-05-04 | Essential Products, Inc. | An imaging device and method for generating an undistorted wide view image |
US9813623B2 (en) | 2015-10-30 | 2017-11-07 | Essential Products, Inc. | Wide field of view camera for integration with a mobile device |
US9843725B2 (en) | 2015-12-29 | 2017-12-12 | VideoStitch Inc. | Omnidirectional camera with multiple processors and/or multiple sensors connected to each processor |
US9906721B2 (en) | 2015-10-30 | 2018-02-27 | Essential Products, Inc. | Apparatus and method to record a 360 degree image |
US10021301B2 (en) | 2015-12-29 | 2018-07-10 | VideoStitch Inc. | Omnidirectional camera with multiple processors and/or multiple sensors connected to each processor |
CN109618090A (en) * | 2017-10-04 | 2019-04-12 | 英特尔公司 | To the method and system of the image distortion correction by using wide-angle lens captured image |
US20190116316A1 (en) * | 2014-12-24 | 2019-04-18 | Agamemnon Varonos | Panoramic windshield viewer system |
US10400929B2 (en) | 2017-09-27 | 2019-09-03 | Quick Fitting, Inc. | Fitting device, arrangement and method |
US10771774B1 (en) * | 2019-03-22 | 2020-09-08 | Varjo Technologies Oy | Display apparatus and method of producing images having spatially-variable angular resolutions |
US10969047B1 (en) | 2020-01-29 | 2021-04-06 | Quick Fitting Holding Company, Llc | Electrical conduit fitting and assembly |
US11035510B1 (en) | 2020-01-31 | 2021-06-15 | Quick Fitting Holding Company, Llc | Electrical conduit fitting and assembly |
Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3953111A (en) | 1974-11-04 | 1976-04-27 | Mcdonnell Douglas Corporation | Non-linear lens |
US4728839A (en) | 1987-02-24 | 1988-03-01 | Remote Technology Corporation | Motorized pan/tilt head for remote control |
US4751660A (en) | 1985-07-09 | 1988-06-14 | Sony Corporation | Determining orientation of transformed image |
US4754269A (en) | 1984-03-05 | 1988-06-28 | Fanuc Ltd | Graphic display method for displaying a perspective view of an object on a CRT |
US4772942A (en) | 1986-01-11 | 1988-09-20 | Pilkington P.E. Limited | Display system having wide field of view |
GB2221118A (en) | 1988-06-21 | 1990-01-24 | Sony Corp | Image transformation apparatus |
US5023725A (en) | 1989-10-23 | 1991-06-11 | Mccutchen David | Method and apparatus for dodecahedral imaging system |
US5048102A (en) | 1987-09-16 | 1991-09-10 | Commissariat A L'energie Atomique | Multiple interpolation process for image correction |
US5067019A (en) | 1989-03-31 | 1991-11-19 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Programmable remapper for image processing |
WO1992021208A1 (en) | 1991-05-13 | 1992-11-26 | Telerobotics International, Inc. | Omniview motionless camera orientation system |
US5173948A (en) | 1991-03-29 | 1992-12-22 | The Grass Valley Group, Inc. | Video image mapping system |
US5175808A (en) | 1989-09-12 | 1992-12-29 | Pixar | Method and apparatus for non-affine image warping |
US5384588A (en) | 1991-05-13 | 1995-01-24 | Telerobotics International, Inc. | System for omindirectional image viewing at a remote location without the transmission of control signals to select viewing parameters |
US5422987A (en) | 1991-08-20 | 1995-06-06 | Fujitsu Limited | Method and apparatus for changing the perspective view of a three-dimensional object image displayed on a display screen |
US5796426A (en) | 1994-05-27 | 1998-08-18 | Warp, Ltd. | Wide-angle image dewarping method and apparatus |
EP0610863B1 (en) | 1993-02-08 | 2001-11-14 | Interactive Pictures Corporation | Omniview motionless camera surveillance system |
US7873233B2 (en) * | 2006-10-17 | 2011-01-18 | Seiko Epson Corporation | Method and apparatus for rendering an image impinging upon a non-planar surface |
- 2008-05-09: US application 12/118,570 filed; patent USRE43490E1 (status: not active, Expired - Fee Related)
Patent Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3953111A (en) | 1974-11-04 | 1976-04-27 | Mcdonnell Douglas Corporation | Non-linear lens |
US4754269A (en) | 1984-03-05 | 1988-06-28 | Fanuc Ltd | Graphic display method for displaying a perspective view of an object on a CRT |
US4751660A (en) | 1985-07-09 | 1988-06-14 | Sony Corporation | Determining orientation of transformed image |
US4772942A (en) | 1986-01-11 | 1988-09-20 | Pilkington P.E. Limited | Display system having wide field of view |
US4728839A (en) | 1987-02-24 | 1988-03-01 | Remote Technology Corporation | Motorized pan/tilt head for remote control |
US5048102A (en) | 1987-09-16 | 1991-09-10 | Commissariat A L'energie Atomique | Multiple interpolation process for image correction |
GB2221118A (en) | 1988-06-21 | 1990-01-24 | Sony Corp | Image transformation apparatus |
US5067019A (en) | 1989-03-31 | 1991-11-19 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Programmable remapper for image processing |
US5175808A (en) | 1989-09-12 | 1992-12-29 | Pixar | Method and apparatus for non-affine image warping |
US5023725A (en) | 1989-10-23 | 1991-06-11 | Mccutchen David | Method and apparatus for dodecahedral imaging system |
US5173948A (en) | 1991-03-29 | 1992-12-22 | The Grass Valley Group, Inc. | Video image mapping system |
WO1992021208A1 (en) | 1991-05-13 | 1992-11-26 | Telerobotics International, Inc. | Omniview motionless camera orientation system |
US5185667A (en) | 1991-05-13 | 1993-02-09 | Telerobotics International, Inc. | Omniview motionless camera orientation system |
US5384588A (en) | 1991-05-13 | 1995-01-24 | Telerobotics International, Inc. | System for omnidirectional image viewing at a remote location without the transmission of control signals to select viewing parameters |
US5877801A (en) | 1991-05-13 | 1999-03-02 | Interactive Pictures Corporation | System for omnidirectional image viewing at a remote location without the transmission of control signals to select viewing parameters |
US5422987A (en) | 1991-08-20 | 1995-06-06 | Fujitsu Limited | Method and apparatus for changing the perspective view of a three-dimensional object image displayed on a display screen |
EP0610863B1 (en) | 1993-02-08 | 2001-11-14 | Interactive Pictures Corporation | Omniview motionless camera surveillance system |
US5796426A (en) | 1994-05-27 | 1998-08-18 | Warp, Ltd. | Wide-angle image dewarping method and apparatus |
US6005611A (en) | 1994-05-27 | 1999-12-21 | Be Here Corporation | Wide-angle image dewarping method and apparatus |
US6346967B1 (en) | 1994-05-27 | 2002-02-12 | Be Here Corporation | Method apparatus and computer program products for performing perspective corrections to a distorted image |
US7873233B2 (en) * | 2006-10-17 | 2011-01-18 | Seiko Epson Corporation | Method and apparatus for rendering an image impinging upon a non-planar surface |
Non-Patent Citations (1)
Title |
---|
Rebiai et al., "Image Distortion from Zoom Lenses: modeling and digital correction," IBC 1992, pp. 438-441, IEE, London, UK, Jul. 1992. |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100033551A1 (en) * | 2008-08-08 | 2010-02-11 | Adobe Systems Incorporated | Content-Aware Wide-Angle Images |
US8525871B2 (en) * | 2008-08-08 | 2013-09-03 | Adobe Systems Incorporated | Content-aware wide-angle images |
US9742994B2 (en) | 2008-08-08 | 2017-08-22 | Adobe Systems Incorporated | Content-aware wide-angle images |
US20140314336A1 (en) * | 2011-12-19 | 2014-10-23 | Dai Nippon Printing Co., Ltd. | Image processing device, image processing method, program for image processing device, recording medium, and image display device |
US9269124B2 (en) * | 2011-12-19 | 2016-02-23 | Dai Nippon Printing Co., Ltd. | Image processing device, image processing method, program for image processing device, recording medium, and image display device |
US20150160539A1 (en) * | 2013-12-09 | 2015-06-11 | Geo Semiconductor Inc. | System and method for automated test-pattern-free projection calibration |
US10901309B2 (en) * | 2013-12-09 | 2021-01-26 | Geo Semiconductor Inc. | System and method for automated test-pattern-free projection calibration |
US20180196336A1 (en) * | 2013-12-09 | 2018-07-12 | Geo Semiconductor Inc. | System and method for automated test-pattern-free projection calibration |
US9915857B2 (en) * | 2013-12-09 | 2018-03-13 | Geo Semiconductor Inc. | System and method for automated test-pattern-free projection calibration |
US20190116316A1 (en) * | 2014-12-24 | 2019-04-18 | Agamemnon Varonos | Panoramic windshield viewer system |
US9906721B2 (en) | 2015-10-30 | 2018-02-27 | Essential Products, Inc. | Apparatus and method to record a 360 degree image |
US9819865B2 (en) | 2015-10-30 | 2017-11-14 | Essential Products, Inc. | Imaging device and method for generating an undistorted wide view image |
US10218904B2 (en) | 2015-10-30 | 2019-02-26 | Essential Products, Inc. | Wide field of view camera for integration with a mobile device |
US9813623B2 (en) | 2015-10-30 | 2017-11-07 | Essential Products, Inc. | Wide field of view camera for integration with a mobile device |
WO2017075501A1 (en) * | 2015-10-30 | 2017-05-04 | Essential Products, Inc. | An imaging device and method for generating an undistorted wide view image |
US9843725B2 (en) | 2015-12-29 | 2017-12-12 | VideoStitch Inc. | Omnidirectional camera with multiple processors and/or multiple sensors connected to each processor |
US10021301B2 (en) | 2015-12-29 | 2018-07-10 | VideoStitch Inc. | Omnidirectional camera with multiple processors and/or multiple sensors connected to each processor |
US10400929B2 (en) | 2017-09-27 | 2019-09-03 | Quick Fitting, Inc. | Fitting device, arrangement and method |
CN109618090A (en) * | 2017-10-04 | 2019-04-12 | 英特尔公司 | To the method and system of the image distortion correction by using wide-angle lens captured image |
US10771774B1 (en) * | 2019-03-22 | 2020-09-08 | Varjo Technologies Oy | Display apparatus and method of producing images having spatially-variable angular resolutions |
US10969047B1 (en) | 2020-01-29 | 2021-04-06 | Quick Fitting Holding Company, Llc | Electrical conduit fitting and assembly |
US11035510B1 (en) | 2020-01-31 | 2021-06-15 | Quick Fitting Holding Company, Llc | Electrical conduit fitting and assembly |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7042497B2 (en) | Wide-angle dewarping method and apparatus | |
USRE43490E1 (en) | Wide-angle dewarping method and apparatus | |
US6870532B2 (en) | Image display | |
US7426317B2 (en) | Image processing apparatus and image processing method, storage medium and computer program | |
US5963215A (en) | Three-dimensional browsing of multiple video sources | |
US6757446B1 (en) | System and process for image-based relativistic rendering | |
EP0998727B1 (en) | Texture mapping in 3-d computer graphics | |
US6243099B1 (en) | Method for interactive viewing full-surround image data and apparatus therefor | |
CN110648274B (en) | Method and device for generating fisheye image | |
CA2995665C (en) | Image generating apparatus and image display control apparatus for a panoramic image | |
CN109997167B (en) | Directional image stitching for spherical image content | |
IL172886A (en) | Panoramic video system with real-time distortion-free imaging | |
WO2017086244A1 (en) | Image processing device, information processing device, and image processing method | |
Nielsen | Surround video: a multihead camera approach | |
US20100033480A1 (en) | Method for Interactively Viewing Full-Surround Image Data and Apparatus Therefor | |
JP3352475B2 (en) | Image display device | |
JP4406824B2 (en) | Image display device, pixel data acquisition method, and program for executing the method | |
WO2009068942A1 (en) | Method and system for processing of images | |
CN110675482A (en) | Spherical Fibonacci pixel dot matrix panoramic picture rendering and displaying method for virtual three-dimensional scene | |
US11120606B1 (en) | Systems and methods for image texture uniformization for multiview object capture | |
JPH07210705A (en) | Virtual reality device | |
Nielsen | High resolution full spherical videos | |
CN117931120B (en) | Camera image visual angle adjusting method based on GPU | |
Byun et al. | Air: Anywhere immersive reality with user-perspective projection | |
Der et al. | Interactive viewing of panoramic images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: B.H. IMAGE CO. LLC, DELAWARE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BE HERE CORPORATION;REEL/FRAME:023535/0040 Effective date: 20071117 |
|
CC | Certificate of correction | ||
FPAY | Fee payment |
Year of fee payment: 8 |
|
AS | Assignment |
Owner name: CHARTOLEAUX KG LIMITED LIABILITY COMPANY, DELAWARE Free format text: MERGER;ASSIGNOR:B.H. IMAGE CO. LLC;REEL/FRAME:037096/0897 Effective date: 20150812 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.) |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.) |