EP1287684A4 - METHOD AND SYSTEM FOR IMAGE FUSION - Google Patents
METHOD AND SYSTEM FOR IMAGE FUSION
- Publication number
- EP1287684A4 (application number EP01927025A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- fused image
- array
- sampling
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 230000004927 fusion Effects 0.000 title claims abstract description 28
- 238000000034 method Methods 0.000 title claims description 37
- 238000005070 sampling Methods 0.000 claims description 35
- 238000013459 approach Methods 0.000 description 3
- 238000004364 calculation method Methods 0.000 description 2
- 238000010586 diagram Methods 0.000 description 2
- 238000007792 addition Methods 0.000 description 1
- 238000007796 conventional method Methods 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2624—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of whole input images, e.g. splitscreen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
Definitions
- This invention relates generally to the field of electro-optical systems and more specifically to a method and system for fusing images.
- Image fusion involves combining two or more images produced by two or more image sensors into one single image. Producing one image that mitigates the weak aspects of the individual images while retaining the strong ones is a complicated task, often requiring a mainframe computer. Known approaches to image fusion have not been able to produce a small, lightweight system that consumes minimal power.
- the processor speed ν_p is limited to about 150 MHz.
- a maximum of approximately two instructions per cycle is allowed in current microprocessors and digital signal processors.
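These two constraints bound the instruction budget available per pixel. The following sketch makes that budget concrete; all of the figures used (clock rate, issue width, sensor resolution, frame rate) are illustrative assumptions chosen to be representative, not values fixed by the patent:

```python
# Hypothetical per-pixel instruction budget: a 150 MHz processor issuing
# 2 instructions per cycle, driving an assumed 640x480 sensor at 30 Hz.
clock_hz = 150_000_000
instructions_per_cycle = 2
n_h, n_v = 640, 480          # assumed horizontal / vertical pixel counts
frame_rate_hz = 30

instructions_per_second = clock_hz * instructions_per_cycle
pixels_per_second = n_h * n_v * frame_rate_hz
budget = instructions_per_second / pixels_per_second
print(f"{budget:.1f} instructions per pixel per frame")
```

Under these assumptions the budget works out to roughly 32 instructions per pixel per frame, which is consistent with the figure quoted later for conventional per-pixel processing.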
- a system for fusing images comprises two or more sensors for generating two or more sets of image data.
- An information processor receives and samples the sets of image data to generate sample data and computes a fused image array from the sample data.
- a display receives the fused image array and displays a fused image generated from the fused image array.
- a four-step method for fusing images is disclosed. Step one calls for receiving sets of image data generated by sensors. Step two provides for sampling the sets of image data to produce sample data. In step three, the method provides for computing a fused image array from the sample data.
- the last step calls for displaying a fused image generated from the fused image array.
- a four-step method for computing a fused image array is disclosed. Step one calls for sampling sets of image data generated from sensors to produce sample data. Step two provides for determining image fusion metrics from the sample data. Step three calls for calculating weighting factors from the image fusion metrics. The last step provides for computing a fused image array from the weighting factors, wherein the fused image array is used to generate the fused image.
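The four steps above can be sketched end to end in Python. This is a minimal illustration using plain nested lists for the image buffers; the fixed-grid sampling, the signal-ratio metric, and the nearest-sample weighting are illustrative choices of ours, not the patent's prescribed implementations:

```python
def compute_fused_array(image_a, image_b, step=2):
    """Sketch of the four-step fusion pipeline (hypothetical helpers)."""
    rows, cols = len(image_a), len(image_a[0])

    # Step 1: sample both image arrays on a fixed grid (every `step` pixels).
    samples = [(i, j) for i in range(0, rows, step)
                      for j in range(0, cols, step)]

    # Step 2: derive an image fusion metric per sampled pixel; here, the
    # fraction of local signal contributed by sensor A (an assumed choice).
    metrics = {}
    for (i, j) in samples:
        a, b = image_a[i][j], image_b[i][j]
        metrics[(i, j)] = a / (a + b) if (a + b) else 0.5

    # Step 3: turn metrics into per-pixel weighting factors; the nearest
    # sampled metric stands in for full interpolation in this sketch.
    def weight(i, j):
        si = min(round(i / step) * step, (rows - 1) // step * step)
        sj = min(round(j / step) * step, (cols - 1) // step * step)
        return metrics[(si, sj)]

    # Step 4: compute the fused array as a weighted blend of both sensors.
    return [[weight(i, j) * image_a[i][j]
             + (1 - weight(i, j)) * image_b[i][j]
             for j in range(cols)]
            for i in range(rows)]
```

Because only the sampled pixels feed the metric computation, the expensive part of the pipeline scales with the number of samples rather than the full sensor resolution, which is the source of the cycle savings discussed later.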
- a technical advantage of the present invention is that it computes the fused image from sampled sensor data. By sampling the sensor data, the invention reduces the number of instruction cycles required to compute a fused image. Reducing the number of instruction cycles allows for smaller, lightweight image fusion systems that consume minimal power. BRIEF DESCRIPTION OF THE DRAWINGS
- FIGURE 1 is a system block diagram of one embodiment of the present invention
- FIGURE 2 is a flowchart demonstrating one method of fusing images in accordance with the present invention
- FIGURE 3A illustrates sampling with a fixed array pattern in accordance with the present invention
- FIGURE 3B illustrates sampling with a varied array pattern in accordance with the present invention
- FIGURE 3C illustrates sampling randomly in accordance with the present invention
- FIGURE 4 illustrates a method of computing weighting factors in accordance with the present invention.
- FIGURE 1 is a system block diagram of one embodiment of the present invention.
- a sensor A 102 and a sensor B 104 detect one or more physical objects 106 in order to generate image data to send to an information processor 108, which fuses the sets of image data to produce a single fused image to be displayed by a display 110.
- the sensor A 102 detects the physical objects 106 and generates sensor data, which is sent to an amplifier A 112.
- The amplifier A 112 amplifies the sensor data and then sends it to an analog-to-digital converter A 114.
- the analog-to-digital converter A 114 converts the analog sensor data to digital data, and sends the data to an image buffer A 116 to store the data.
- the sensor B operates in a similar fashion.
- the sensor B 104 detects the physical objects 106 and sends the data to amplifier B 118.
- the amplifier B 118 sends amplified data to an analog-to-digital converter B 120, which sends converted data to an image buffer B 122.
- a field programmable gate array 124 receives the data generated by the sensor A 102 and the sensor B 104.
- the information processor 108 receives the data from the field programmable gate array 124.
- the information processor 108 generates a fused image from the sets of sensor data, and uses an information processor buffer 126 to store data while generating the fused image.
- the information processor 108 sends the fused image data to a display buffer 128, which stores the data until it is to be displayed on the display 110.
- FIGURE 2 is a flowchart demonstrating one method of image fusion in accordance with the present invention. The following steps may be performed automatically using an information processor 108.
- the method begins with step 202, where two or more image sensors generate two or more sets of image data. As above, suppose that there are two image sensors, each with the same pixel arrangement. Let
- N_h be the number of horizontal pixels
- N_v be the number of vertical pixels, such that the total number of pixels per sensor is N_h · N_v.
- the sensors may comprise, for example, visible light or infrared light image detectors. Assume that detectable variations in the proportion of the fused image computed from one set of image data and from the other set of image data occur over a time τ_s, where τ_s > 1/ν_d (with ν_d the display frame rate). Hence, the proportion does not need to be recalculated at every frame. Also, assume that the information required to form a metric that adjusts the proportion given to each wavelength can be derived from fewer than N_h · N_v pixels.
- FIGURES 3A, 3B, and 3C illustrate three methods of sampling image data in accordance with the present invention.
- FIGURE 3A illustrates sampling with a fixed array pattern.
- the sampled pixels (i, j) 302, 304, 306, 308, 310, 312, 314, 316, and 318 may be described by:
- Δ_h for the horizontal difference from one sampled pixel to the next sampled pixel, and
- Δ_v for the vertical difference from one sampled pixel to the next sampled pixel.
- FIGURE 3B illustrates sampling with a varied array pattern.
- FIGURE 3C illustrates random sampling. A sequence of sampling patterns may also be used, repeating at any given number of sampling cycles, or never repeating, as in a random pattern for each continued sampling cycle.
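The three sampling strategies of Figures 3A, 3B, and 3C can be sketched as follows; the function names, the per-frame offset rule for the varied pattern, and the uniform draw for the random pattern are our illustrative assumptions:

```python
import random

def fixed_grid(n_h, n_v, dh, dv):
    """Fixed array pattern: every dv-th row, dh-th column (cf. Figure 3A)."""
    return [(i, j) for i in range(0, n_v, dv) for j in range(0, n_h, dh)]

def varied_grid(n_h, n_v, dh, dv, frame):
    """Varied array pattern: the grid offset shifts each frame (cf. Figure 3B)."""
    oi, oj = frame % dv, frame % dh
    return [(i, j) for i in range(oi, n_v, dv) for j in range(oj, n_h, dh)]

def random_sample(n_h, n_v, count, rng=random):
    """Random sampling: `count` pixels drawn uniformly (cf. Figure 3C)."""
    return [(rng.randrange(n_v), rng.randrange(n_h)) for _ in range(count)]
```

A sequence of such patterns, as the text notes, may repeat after any number of sampling cycles or never repeat at all.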
- a fused image array is computed from the sample data.
- image fusion metrics are calculated from the sample data.
- the image fusion metrics are values assigned to the pixels of the sample data. These values, for example, may give the relative weight of the data from each sensor, such that the data from the sensor that produces the better image is given more weight. Or, these values may be used to provide a control for the production of, for example, a false color image. All the pixels may be assigned the same metric, α, or each sampled pixel may be assigned its own metric, α_ij, where the subscript ij designates the pixel in the i-th row and j-th column.
- weighting factors α are calculated from the image fusion metrics.
- the weighting factors are values assigned to the pixels of the fused image.
- the weighting factors may be computed by, for example, linear interpolation of the image fusion metrics.
- FIGURE 4 illustrates a method of computing weighting factors in accordance with the present invention.
- a sampling block 410 comprises two sampled points 402 and 404.
- the weighting factors α of the first row may be computed in the following manner.
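In one dimension, the linear interpolation across a sampling block reduces to a few lines. The sketch below assumes (our convention, not the patent's) that both endpoints of the block are sampled and that `delta` pixels separate them:

```python
def row_weights(alpha_left, alpha_right, delta):
    """Linearly interpolate weighting factors across one row of a
    sampling block, from the metric at the left sampled point to the
    metric at the right sampled point, delta pixels apart."""
    return [alpha_left + (alpha_right - alpha_left) * k / delta
            for k in range(delta + 1)]
```

For example, with metrics 0.0 and 1.0 at the two sampled points and delta = 4, the interpolated row of weighting factors is [0.0, 0.25, 0.5, 0.75, 1.0].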
- step 210 a fused image array, which is used to generate a fused image, is computed from the weighting factors.
- An array of weighting factors α generates the following fused image array:
- V_ij^(d) = V_ij^(A) · α_ij + V_ij^(B) · (1 − α_ij)
- the superscript (d) denotes display
- (A) denotes sensor A
- (B) denotes sensor B
- V_ij corresponds to the voltage at pixel (i, j).
- the fused image array describes the relative weights of the data from each sensor.
- Weighting factor α_ij gives the relative weight of the voltage from sensor A at pixel (i, j);
- weighting factor (1 − α_ij) gives the relative weight of the voltage from sensor B at pixel (i, j). This example shows a linear weight; other schemes, however, can be used.
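The per-pixel blend is a direct transcription of the equation above (the helper name is ours):

```python
def fused_pixel(v_a, v_b, alpha):
    """V(d)_ij = alpha_ij * V(A)_ij + (1 - alpha_ij) * V(B)_ij.

    v_a, v_b: voltages from sensor A and sensor B at one pixel;
    alpha: the weighting factor for that pixel, in [0, 1].
    """
    return alpha * v_a + (1.0 - alpha) * v_b
```

With alpha = 1 the display shows sensor A unchanged, with alpha = 0 sensor B unchanged, and intermediate values blend the two linearly.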
- the method then proceeds to step 212, where the fused image generated from the fused image array is displayed on a display 110.
- this embodiment allows for more instruction cycles to calculate α_ij for each sampled pixel.
- To calculate the number of instruction cycles available for each sampled pixel, first calculate the total number of instruction cycles per sampled pixel, and then subtract the number of cycles per pixel needed to sample the pixels and to compute the fused image metrics and the fused image array. For example, assume that data is sampled using fixed array sampling. The total number of instructions for each sampled pixel is given by:
- n_h and n_v are the number of sampled pixels in the horizontal direction and in the vertical direction, respectively. Sampling each sampling block, without double counting borders, requires about (Δ + 1)[2(Δ − 1) + 6] instruction cycles. Each sampling block contains two sampled pixels, so each sampled pixel loses about (1/2)(Δ + 1)[2(Δ − 1) + 6] instruction cycles,
- leaving approximately 269 instruction cycles. This is a dramatic improvement compared with the 32 cycles allotted in conventional methods. The extra cycles may be used for more complex calculations of α_ij or other features. Moreover, if α_ij is assumed to be the same for all pixels, even more cycles become available to determine α_ij, allowing for more sophisticated manipulation.
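The budget shift that sampling buys can be sketched numerically. Every figure below (instruction rate, frame rate, number of sampled pixels, per-sample overhead) is an assumed illustration, not the patent's worked example:

```python
def cycles_per_sampled_pixel(instr_per_sec, frame_rate_hz, n_sampled,
                             overhead_per_sample):
    """Instructions available per sampled pixel per frame, after
    subtracting the per-sample cost of sampling and fusion bookkeeping."""
    per_frame = instr_per_sec / frame_rate_hz
    return per_frame / n_sampled - overhead_per_sample

# Assumed numbers: 300 MIPS, 30 Hz, 4800 sampled pixels (an 80x60 grid),
# and 50 cycles of overhead per sampled pixel.
print(cycles_per_sampled_pixel(300e6, 30, 4800, 50))
```

Even with these rough assumptions the per-sampled-pixel budget runs into the thousands of cycles, versus the roughly 32 cycles available when every pixel must be processed, which illustrates why sampling leaves room for more elaborate computation of α_ij.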
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US56126000A | 2000-04-27 | 2000-04-27 | |
US561260 | 2000-04-27 | ||
PCT/US2001/012260 WO2001084828A1 (en) | 2000-04-27 | 2001-04-13 | Method and system for fusing images |
Publications (2)
Publication Number | Publication Date |
---|---|
EP1287684A1 EP1287684A1 (en) | 2003-03-05 |
EP1287684A4 true EP1287684A4 (en) | 2006-07-12 |
Family
ID=24241253
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP01927025A Withdrawn EP1287684A4 (en) | 2000-04-27 | 2001-04-13 | METHOD AND SYSTEM FOR IMAGE FUSION |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP1287684A4 (en) |
AU (1) | AU2001253518A1 (en) |
CA (1) | CA2404654A1 (en) |
IL (1) | IL151574A0 (en) |
WO (1) | WO2001084828A1 (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7176963B2 (en) * | 2003-01-03 | 2007-02-13 | Litton Systems, Inc. | Method and system for real-time image fusion |
CN1303432C (en) * | 2003-06-05 | 2007-03-07 | 上海交通大学 | Remote sensing image picture element and characteristic combination optimizing mixing method |
CN1313972C (en) * | 2003-07-24 | 2007-05-02 | 上海交通大学 | Image merging method based on filter group |
US7091930B2 (en) * | 2003-08-02 | 2006-08-15 | Litton Systems, Inc. | Centerline mounted sensor fusion device |
US7373023B2 (en) | 2003-11-12 | 2008-05-13 | Northrop Grumman Guidance And Electronics Company, Inc. | Method and system for generating an image |
CN100410684C (en) * | 2006-02-23 | 2008-08-13 | 复旦大学 | Remote Sensing Image Fusion Method Based on Bayesian Linear Estimation |
US9053558B2 (en) | 2013-07-26 | 2015-06-09 | Rui Shen | Method and system for fusing multiple images |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5140416A (en) * | 1990-09-18 | 1992-08-18 | Texas Instruments Incorporated | System and method for fusing video imagery from multiple sources in real time |
US5416851A (en) * | 1991-07-30 | 1995-05-16 | Xerox Corporation | Image analysis based on location sampling |
US5488674A (en) * | 1992-05-15 | 1996-01-30 | David Sarnoff Research Center, Inc. | Method for fusing images and apparatus therefor |
DE19502640C1 (en) * | 1995-01-20 | 1996-07-11 | Daimler Benz Ag | Fusing images of same view from different sensors for night sight |
US5550937A (en) * | 1992-11-23 | 1996-08-27 | Harris Corporation | Mechanism for registering digital images obtained from multiple sensors having diverse image collection geometries |
WO1998059320A1 (en) * | 1997-06-21 | 1998-12-30 | Raytheon Company | System and method for local area image processing |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5159455A (en) * | 1990-03-05 | 1992-10-27 | General Imaging Corporation | Multisensor high-resolution camera |
KR0148695B1 (en) * | 1992-08-08 | 1998-09-15 | 강진구 | Video camera high definition device |
JP3058774B2 (en) * | 1993-01-29 | 2000-07-04 | 株式会社河合楽器製作所 | Image synthesizing apparatus and image synthesizing method |
US5889553A (en) * | 1993-11-17 | 1999-03-30 | Canon Kabushiki Kaisha | Image pickup apparatus capable of high resolution imaging |
JPH08214201A (en) * | 1994-11-28 | 1996-08-20 | Canon Inc | Image pickup device |
-
2001
- 2001-04-13 EP EP01927025A patent/EP1287684A4/en not_active Withdrawn
- 2001-04-13 AU AU2001253518A patent/AU2001253518A1/en not_active Abandoned
- 2001-04-13 IL IL15157401A patent/IL151574A0/en unknown
- 2001-04-13 WO PCT/US2001/012260 patent/WO2001084828A1/en active Application Filing
- 2001-04-13 CA CA002404654A patent/CA2404654A1/en not_active Abandoned
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5140416A (en) * | 1990-09-18 | 1992-08-18 | Texas Instruments Incorporated | System and method for fusing video imagery from multiple sources in real time |
US5416851A (en) * | 1991-07-30 | 1995-05-16 | Xerox Corporation | Image analysis based on location sampling |
US5488674A (en) * | 1992-05-15 | 1996-01-30 | David Sarnoff Research Center, Inc. | Method for fusing images and apparatus therefor |
US5550937A (en) * | 1992-11-23 | 1996-08-27 | Harris Corporation | Mechanism for registering digital images obtained from multiple sensors having diverse image collection geometries |
DE19502640C1 (en) * | 1995-01-20 | 1996-07-11 | Daimler Benz Ag | Fusing images of same view from different sensors for night sight |
WO1998059320A1 (en) * | 1997-06-21 | 1998-12-30 | Raytheon Company | System and method for local area image processing |
Non-Patent Citations (1)
Title |
---|
See also references of WO0184828A1 * |
Also Published As
Publication number | Publication date |
---|---|
IL151574A0 (en) | 2003-04-10 |
CA2404654A1 (en) | 2001-11-08 |
EP1287684A1 (en) | 2003-03-05 |
AU2001253518A1 (en) | 2001-11-12 |
WO2001084828A1 (en) | 2001-11-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US4133004A (en) | Video correlation tracker | |
US6584219B1 (en) | 2D/3D image conversion system | |
US20020012459A1 (en) | Method and apparatus for detecting stereo disparity in sequential parallel processing mode | |
EP1072147A2 (en) | Method and apparatus for measuring similarity using matching pixel count | |
JPH1040392A (en) | Local area image tracking device | |
JPS60229594A (en) | Method and device for motion interpolation of motion picture signal | |
JP2947360B2 (en) | Method and apparatus for measuring image motion | |
EP0182186A2 (en) | Back projection image reconstruction apparatus and method | |
EP1287684A4 (en) | METHOD AND SYSTEM FOR IMAGE FUSION | |
US10825160B2 (en) | Spatially dynamic fusion of images of different qualities | |
JP3161467B2 (en) | Method for temporal interpolation of images and apparatus for implementing this method | |
JP2001154646A (en) | Infrared image display | |
JP3786300B2 (en) | Motion vector detection apparatus and motion vector detection method | |
EP1830562A1 (en) | Learning device, learning method, and learning program | |
JPH0364279A (en) | Image blurring detecting device | |
US6873395B2 (en) | Motion picture analyzing system | |
JP4053282B2 (en) | Image processing apparatus and image processing method | |
JP3038935B2 (en) | Motion detection device | |
CN117409043B (en) | Sub-pixel level video target tracking method, device, equipment and storage medium | |
JPH06105603B2 (en) | Image correction device for scanning electron microscope | |
JPH11175725A (en) | Frame processing type stereo image processing device | |
Liu | Simultaneous Monocular Visual Odometry and Depth Reconstruction with Scale Recovery | |
Åström et al. | Statistical approach to time-to-impact estimation suitable for real-time near-sensor implementation | |
JP4451161B2 (en) | Obstacle recognition assist device | |
KR970011380B1 (en) | 2-d motion vector estimation apparatus for hdtv |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20021001 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
|
AX | Request for extension of the european patent |
Extension state: AL LT LV MK RO SI |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AT BE FR GB |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: 8566 |
|
A4 | Supplementary search report drawn up and despatched |
Effective date: 20060609 |
|
17Q | First examination report despatched |
Effective date: 20070426 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20120605 |