GB2474557A - Vehicle movement detection using visible and invisible light - Google Patents
- Publication number
- GB2474557A GB2474557A GB1016975A GB201016975A GB2474557A GB 2474557 A GB2474557 A GB 2474557A GB 1016975 A GB1016975 A GB 1016975A GB 201016975 A GB201016975 A GB 201016975A GB 2474557 A GB2474557 A GB 2474557A
- Authority
- GB
- United Kingdom
- Prior art keywords
- imaging
- image
- imaging sensor
- area
- optical image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 238000001514 detection method Methods 0.000 title description 5
- 238000003384 imaging method Methods 0.000 claims abstract description 234
- 230000003287 optical effect Effects 0.000 claims abstract description 97
- 238000012544 monitoring process Methods 0.000 claims abstract description 5
- 230000005484 gravity Effects 0.000 claims description 9
- 230000000903 blocking effect Effects 0.000 abstract description 22
- 238000000034 method Methods 0.000 description 79
- 238000003333 near-infrared imaging Methods 0.000 description 21
- 238000010276 construction Methods 0.000 description 7
- 238000012545 processing Methods 0.000 description 6
- 238000013459 approach Methods 0.000 description 4
- 230000006866 deterioration Effects 0.000 description 3
- 238000010586 diagram Methods 0.000 description 3
- 230000010287 polarization Effects 0.000 description 3
- 238000013461 design Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000008521 reorganization Effects 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
- H04N23/21—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only from near infrared [NIR] radiation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
-
- H04N5/2254—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- Toxicology (AREA)
- Closed-Circuit Television Systems (AREA)
- Traffic Control Systems (AREA)
Abstract
An imaging system has a divided, shared imaging sensor 212 with first 218 and second 219 imaging areas, a color (infrared blocking) filter 215 provided on the first sensor area, and an invisible light (e.g. infrared) filter 213 on the second area. The sensor detects an optical image that moves in one direction across the first and second imaging areas (see Figure 3). An imaging system controller determines whether captured optical images on either image detector area correspond to a predetermined condition, retrieves from a memory an image captured a certain period before the optical image corresponding to the predetermined condition, and stores the retrieved image. A vehicle monitoring system using the imaging system is positioned by a road, and it is determined whether a vehicle passes through a predetermined area of the angle of view of the sensor; the sensor view angle can be virtually divided into plural areas, and repeated vehicle movement can be detected. Two sensors (512, 515; Figure 13) and a beam splitter (513) may be used to separate and capture visible and invisible (e.g. IR) light.
Description
IMAGER AND IMAGING SYSTEM FOR IMAGING MOBILE OBJECT
The present invention relates to an imaging system that photographs a mobile object and stores a photographed image.
Japanese Examined Patent Application Publication No. H02-38997 discloses a vehicle-identification-number scanning system that scans a registration number of a moving vehicle. The vehicle-identification-number scanning system comprises a camera and a position sensor that are provided for each driving lane. When the position sensor detects a vehicle, an imaging instruction is sent to the camera provided above the driving lane that corresponds to the position sensor.
In such a construction, however, a camera must be provided for each driving lane; the camera cost therefore increases when multiple lanes are observed. In addition, when a camera is provided for each driving lane, the movement of a vehicle cannot be properly recorded because the view angle of each camera is limited. Furthermore, in a construction in which one camera is provided for each driving lane, the cameras must cooperate with each other and be positioned in consideration of their respective view angles.
Therefore, the cost of setting up the cameras is increased.
Japanese Patent Application Publication No. 2001-257923 discloses an imager that simultaneously captures multiple images with multiple optical systems. The imager comprises a telephoto lens, a wide-angle lens, and one imaging sensor. That is, the imager has two optical systems. An optical image passing through the telephoto lens is captured on the upper half of the imaging area of the imaging sensor. An optical image passing through the wide-angle lens is captured on the lower half of the imaging area of the imaging sensor. The imager simultaneously captures a telephoto image and a wide-angle image.
However, in the imager having two optical systems, the view angle of each optical system is decided according to the distance from an object. Therefore, the imager must be designed according to its distance from an object. The predetermined design restriction limits the focusing area of the imager. Moreover, two optical systems increase the cost of the imager.
An object of the present invention is to provide an imaging system that records movement of a moving vehicle and captures a high-resolution image of the moving vehicle with one camera.
Another object of the present invention is to provide a simple imager that simultaneously captures multiple objects and has an optical system with multiple different configurations.
According to an aspect of the present invention there is provided an imaging system having an imaging sensor, a color filter, an invisible light filter, a memory and a controller. The imaging sensor has a first imaging area and a second imaging area that is different from the first imaging area, and that captures an optical image that moves on the first imaging area and the second imaging area. The color filter is provided on the first imaging area. The invisible light filter is provided on the second imaging area. The memory stores an image captured by the imaging sensor. The controller retrieves an image from the memory and stores the retrieved image in a storage medium, and determines whether or not the optical image corresponds to a predetermined condition based on an image that was captured on either the first imaging area or the second imaging area. The imaging sensor captures an optical image, which passes through the color filter, and outputs a color image; and captures an optical image, which passes through the invisible light filter, and outputs an invisible light image. In the case that the controller determines that the optical image corresponds to a predetermined condition, the controller retrieves from the memory an image captured a certain period before the optical image corresponding to the predetermined condition, and stores the retrieved image in the storage medium.
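In practical terms, the retrieve-an-earlier-image behaviour of the controller amounts to a pre-trigger ring buffer: frames are written continuously to the memory, and when the predetermined condition is detected the controller reaches back a fixed number of frames. The sketch below is illustrative only; the class name, frame rate, and look-back period are assumptions, not taken from the patent.

```python
from collections import deque

class PreTriggerBuffer:
    """Ring buffer of recent frames, standing in for the claimed memory."""

    def __init__(self, frame_rate=10, lookback_s=2.0):
        self.frame_rate = frame_rate
        # Hold just enough frames to reach back lookback_s seconds.
        self.frames = deque(maxlen=int(frame_rate * lookback_s))

    def push(self, frame):
        """Store each newly captured frame; the oldest is discarded."""
        self.frames.append(frame)

    def frame_before(self, period_s):
        """Return the frame captured period_s before the newest frame,
        i.e. the image captured a certain period before the condition."""
        index = int(period_s * self.frame_rate)
        return self.frames[-1 - index]  # IndexError if too few frames yet
```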
Certain examples of the present invention will now be described with reference to the accompanying drawings, of which:
Fig. 1 is a block diagram of a first imaging system according to a first embodiment;
Fig. 2 is a cutaway side view of part of the first imaging unit;
Fig. 3 shows an image captured by the first imaging unit;
Fig. 4 is a flowchart of a lane-change recording process;
Fig. 5 is a flowchart of a lane-change initialization process;
Fig. 6 is a flowchart of a roadside traffic recording process;
Fig. 7 is a flowchart of a traffic initialization process;
Fig. 8 is a flowchart of a vehicle color recording process;
Fig. 9 is a flowchart of a vehicle color initialization process;
Fig. 10 is a flowchart of a vehicle family recording process;
Fig. 11 is a flowchart of a vehicle family initialization process;
Fig. 12 is a block diagram of a second imaging system according to the second embodiment;
Fig. 13 is a cutaway side view of part of the second imaging unit;
Fig. 14 is a block diagram of a third imaging system according to the third embodiment; and
Fig. 15 is a cutaway side view of part of the third imaging unit.
The present invention is described below with reference to the embodiments shown in the drawings. Figs. 1-3 show a first imaging system according to the first embodiment of the present invention.
The first imaging system 100 comprises a first client 200 that is positioned for photographing an object, and a central controller 300 that is provided separately from the first client 200. The first client 200 comprises a first imager 210, a calculator 220, a vehicle detector 230, a near-infrared light source 240, a recorder 250, and a communicator 260.
The central controller 300 comprises a central communicator 310, an alarm 320, a central recorder 330, and a central calculator 340.
The first imager 210 comprises an imaging lens 211 that is an imaging optical system, a first CCD 212 that is an imaging sensor, a first visible-light blocking filter 213 that is an invisible light filter, a first polarizer 214, a first infrared blocking filter 215 that is a color filter, and a light path length adjustment filter 216.
It will be noted that invisible light relates to light having a wavelength that is not detectable by the human eye, whereas visible light is light having a wavelength that is detectable by the human eye.
The imaging lens 211 has an aperture and focal length that provide a deep depth of field, so that the first imager 210 is capable of deep-focus shooting. That is, the imaging lens 211 can simultaneously focus on objects that exist anywhere from 10 meters to 100 meters from the first imager 210.
The first CCD 212 has more than one million pixels. A first imaging surface 217 of the first CCD 212, which is rectangular, captures an optical image that passes through the imaging lens 211. The first CCD 212 is provided in the first client 200 and configured so that the lengthwise direction of the first imaging surface 217 corresponds to the direction of the force of gravity acting on the first client 200. The first imaging surface 217 is bisected into an upper half and a lower half by a line perpendicular to the direction of gravity. The upper half of the first imaging surface 217 is a near-infrared imaging area 218 and the lower half is a visible light imaging area 219. An optical image is provided on both the near-infrared imaging area 218 and the visible imaging area 219. The first CCD 212 simultaneously captures an optical image provided on both the near-infrared imaging area 218 and the visible imaging area 219. The first CCD 212 repeatedly captures an optical image during a predetermined period, for example, ten times in one second, so that the CCD 212 outputs a 10 frame-per-second moving image to the calculator 220 and the recorder 250. The frame rate of capture may be properly determined as required.
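The virtual division of the first imaging surface can be pictured with a short sketch. This is a minimal illustration assuming each captured frame arrives as a NumPy array whose row index increases downward; the function and variable names are hypothetical, not taken from the patent.

```python
import numpy as np

def split_imaging_surface(frame):
    """Split one captured frame into the two virtual imaging areas.

    The first imaging surface 217 is bisected by a line perpendicular
    to the direction of gravity: the upper half is the near-infrared
    imaging area 218, the lower half the visible light imaging area 219.
    """
    half = frame.shape[0] // 2
    near_infrared_area = frame[:half]  # upper half (area 218)
    visible_light_area = frame[half:]  # lower half (area 219)
    return near_infrared_area, visible_light_area
```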
The first polarizer 214 is an optical filter that transmits only light having a predetermined polarization state. The first visible-light blocking filter (the near-infrared filter) 213 does not transmit visible light; it is an invisible light filter and an optical filter that transmits only near-infrared light. The first polarizer 214 is attached to the first CCD 212 so that it covers the entire surface of the near-infrared imaging area 218. The first visible-light blocking filter 213 is attached to a surface of the first polarizer 214, which is located near the imaging lens 211, so that it covers the entire surface of the first polarizer 214 and the near-infrared imaging area 218.
The first infrared blocking filter 215 is a color filter that does not transmit infrared light, but does transmit other light. The light path length-adjustment filter 216 is an optical filter that adjusts the path length of passing light and transmits light of every wavelength. The light path length-adjustment filter 216 is attached to the first CCD 212 so that it covers the entire surface of the visible light imaging area 219. The first infrared blocking filter 215 is attached to a surface of the light path length-adjustment filter 216, which is located near the imaging lens 211, so that it covers the entire surface of the light path length-adjustment filter 216 and the visible light imaging area 219. The thickness and refractive index of the light path length-adjustment filter 216 are configured in consideration of the thicknesses and refractive indexes of the first visible-light blocking filter 213, the first polarizer 214, and the first infrared blocking filter 215, so that the light path length obtained by adding the light path length of the first infrared blocking filter 215 and the light path length of the light path length-adjustment filter 216 is equal to the light path length obtained by adding the light path length of the first visible-light blocking filter 213 and the light path length of the first polarizer 214.
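The design constraint on the light path length-adjustment filter 216 can be expressed numerically: to a first approximation, the optical path length of a filter stack is the sum of thickness times refractive index over its layers, and filter 216 is dimensioned so the two stacks match. The thickness and index values below are placeholders, not figures from the patent.

```python
def optical_path_length(stack):
    """Optical path length of a filter stack: sum of thickness x index.

    stack: list of (thickness_mm, refractive_index) tuples.
    """
    return sum(t * n for t, n in stack)

# Placeholder values; the real thicknesses and indexes are not given.
nir_side = [(1.0, 1.52),        # first visible-light blocking filter 213
            (0.5, 1.49)]        # first polarizer 214
visible_side = [(1.0, 1.52),    # first infrared blocking filter 215
                (0.5, 1.49)]    # light path length-adjustment filter 216

# Filter 216 is chosen so the two optical path lengths are equal.
assert optical_path_length(nir_side) == optical_path_length(visible_side)
```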
For example, the first imager 210 is provided above a driving lane of a highway for photographing a vehicle that approaches from a long distance and for monitoring the vehicle as it moves within the driving lane. The vertical direction of an optical image provided on the first imaging surface 217 is inverted by the photographing lens 211 so that it matches the vertical direction of an object.
Therefore, when an object such as a vehicle approaching the client 200 is located at a comparatively long distance, an optical image of the vehicle is captured in the visible light imaging area 219 in the lower half of the first imaging surface 217. When a vehicle approaching the client is located at a comparatively close distance, an optical image of the vehicle appears in the near-infrared imaging area 218 in the upper half of the first imaging surface 217. Thereby, the first client 200 photographs an object located at a comparatively long distance and outputs a color image, and photographs an object located at a comparatively short distance and outputs an infrared image (see Fig. 3). The vehicle detector 230 is a loop coil that is provided in the view angle of the first imager 210 and located under the road surface for the detection of a passing vehicle.
The near-infrared light source 240 is provided in a location such that it can illuminate a moving vehicle and an object whose optical image is provided on the near-infrared imaging area 218. Its location, for example, is on a free-flow gate above either a driving lane or the shoulder of the road.
The calculator 220 receives a vehicle detection signal from the vehicle detector 230 and a moving image from the first CCD 212, and then processes the image. The image processing carried out by the calculator 220 includes the lane-change recording process, the roadside traffic recording process, the vehicle color recording process, and the vehicle family recording process, each of which is described below. In these recording processes, the calculator 220 creates a still image of a driver and car registration plate that is captured on the near-infrared imaging area 218, and sends the still image to the recorder 250.
The recorder 250 is a memory that comprises a storage device, such as a hard disk, or a semiconductor memory device, such as a DRAM, and temporarily stores a moving image sent by the first imager 210 and a still image sent by the calculator 220.
The communicator 260 is connected to the central controller 300 by wire or radio signal. It determines whether or not still images and moving images stored in the recorder 250 correspond to a predetermined condition, and it sends corresponding images to the central controller 300.
The predetermined condition is described hereinafter.
The central communicator 310 receives a still image and a moving image, and sends them both to the central recorder 330. In addition, it sends an alarm signal to the alarm 320 when it receives a still or moving image from the communicator 260.
The central recorder 330 is a storage device such as a hard disk, memory, etc., that stores both still and moving images that it receives from the central communicator 310.
The alarm 320 comprises a speaker that emits sound, a display that generates an alarm image, or an indicator light that blinks when it receives an alarm signal from the central communicator 310.
The central calculator 340 scans the car registration number on a car registration plate that is included in a still image, and determines whether or not it corresponds to a predetermined monitoring number.
The lane-change recording process is described hereinafter with reference to Figs. 4 and 5.
The lane-change recording process is executed by the calculator 220 when the first client 200 begins photographing an image.
In Step S401, a lane-change initialization process is executed. The lane-change initialization process selects a driving lane area in an image photographed by the first imager 210, assigns a lane number to the selected driving lane area, and then picks one frame from a moving image. The driving lane area of an image is an area occupied by a driving lane. The details of the lane-change initialization process are described hereinafter. Note that a driving lane is an area on a road that vehicles drive within, bounded by white or yellow lane lines. Capturing is the process of picking one frame out of a moving image. Frame No. 0 is the frame captured in the lane-change initialization process.
In Step S402, a lane change parameter "m" is initialized with the value of zero. The lane change parameter m indicates the number of times an observed vehicle changes lanes.
In Step S403, a frame number parameter "n" is initialized with the value of one. The frame number parameter n indicates the frame number and is incrementally assigned to each frame of a moving image from a first frame to a last frame.
In Step S404, frames numbered n+1 and n+2 are captured from a moving image. Frame number n+2 is captured a predetermined period after frame number n+1 is captured.
For example, the predetermined period is 100 milliseconds.
In Step S405, "P" is the difference between frame number n+1 and frame number 0 or n, and "Q" is the difference between frame number n+2 and frame number n+1.
Therefore, the value of a pixel in a certain frame is picked up if it is different from the value of a corresponding pixel in a frame that follows the certain frame by a predetermined period. The pixel that has a different value corresponds to a mobile object, that is, an object that moves during the time between the capture of a certain frame and the capture of a different frame a predetermined period later.
In Step S406, the differences P and Q are binarized with a predetermined threshold value. Then, a logical multiplication R is calculated with the binarized differences P and Q. According to binarization with a predetermined threshold value, a pixel that does not represent a mobile object is deemed to be signal noise and is removed. According to the logical multiplication R calculated with the binarized differences P and Q, a mobile object included in a certain frame and in a frame that follows a predetermined period afterward is specified. This
specification process is a marking operation.
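The differencing, binarization, and logical multiplication of Steps S405-S406 (and the analogous later steps) can be condensed into a short sketch. This is a minimal illustration assuming 8-bit grayscale frames held as NumPy arrays; the function name and threshold value are hypothetical, not taken from the patent.

```python
import numpy as np

def mark_mobile_object(frame_a, frame_b, frame_c, threshold=30):
    """Mark pixels belonging to a mobile object across three frames.

    frame_a, frame_b, frame_c: consecutive frames (8-bit grayscale
    arrays) captured a predetermined period apart, e.g. 100 ms.
    """
    # Differences between successive frames (P and Q in the text).
    p = np.abs(frame_b.astype(np.int16) - frame_a.astype(np.int16))
    q = np.abs(frame_c.astype(np.int16) - frame_b.astype(np.int16))

    # Binarize with a predetermined threshold; pixels below it are
    # deemed signal noise and removed.
    p_bin = p > threshold
    q_bin = q > threshold

    # Logical multiplication R: a pixel is marked only if it changed
    # in both intervals, i.e. it belongs to an object that is present
    # and moving around the middle frame.
    return np.logical_and(p_bin, q_bin)
```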
In Step S407, a lane number and a driving lane area in which the mobile object marked in Step S406 is located are calculated. This specifies the driving lane in which a mobile object is located.
In Step S408, a frame is captured from a moving image output by the first imager 210. Then, the frame number parameter n is incrementally increased by one for each subsequent captured frame. A total of three frames are captured by Step S408. That is, two frames are captured in Step S404, and the other one is captured in S408. Hereinafter, the latest captured frame is frame number n+2, the frame captured immediately before frame number n+2 is frame number n+1, and the frame captured immediately before frame number n+1 is frame number n.
In Step S409, "S" is calculated as the difference between frame number n+1 and frame number n, and "T" is calculated as the difference between frame number n+2 and frame number n+1.
Therefore, a mobile object is detected if it moves from the moment a certain frame is captured until a different frame is captured a predetermined time later.
In Step S410, the differences S and T are binarized with a predetermined threshold value. Then, a logical multiplication U is calculated with the binarized differences S and T. According to binarization with a predetermined threshold value, a pixel that does not represent a mobile object is deemed to be noise and is removed. According to the logical multiplication U calculated with the binarized differences S and T, a mobile object is marked if it is included in a certain frame and in another frame that follows a predetermined time later.
In Step S411, a lane number and a driving lane area in which the mobile object marked in Step S410 is located are calculated. This specifies the driving lane in which a mobile object is presently located.
In Step S412, it is determined whether the lane number calculated in Step S407 is different from the lane number calculated in Step S411. If they are different, the process proceeds to Step S413. If they are the same, the process proceeds to Step S414.
In Step S413, the lane change parameter m is incremented by one, because a different lane number in Step S412 indicates that the mobile object has changed driving lanes.
In Step S414, it is determined whether or not the lane change parameter m is greater than or equal to a maximum lane change parameter mMAX. The maximum lane change parameter mMAX is a maximum value for the number of times a mobile object is allowed to change lanes, and is decided by a user before starting the lane-change recording process.
In the case that the lane change parameter m is greater than or equal to the maximum lane change parameter mMAX, the process proceeds to Step S415, otherwise it returns to Step S408.
In Step S415, multiple frames are captured from a moving image output by the first imager 210. The capture interval and the total number captured are decided so as to record all of the lane changes of a mobile object. In the present embodiment, ten frames are captured every 0.1 second.
In Step S416, all of the frames captured in Step S415 are sent to the central controller 300 by passing through the communicator 260. In the central controller 300, the central recorder 330 stores all of the received frames. Then, the process ends.
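Taken together, Steps S402-S416 form a simple monitoring loop. The following sketch is a hypothetical condensation; capture_frame, lane_of, burst_capture, and send_to_central stand in for the operations described above and are not names from the patent.

```python
def lane_change_recording(m_max, capture_frame, lane_of,
                          burst_capture, send_to_central):
    """Sketch of Steps S402-S416: count lane changes, record at m_max.

    capture_frame() returns the next frame; lane_of(f0, f1, f2) performs
    the differencing and marking of Steps S405-S407 and returns the lane
    number occupied by the marked mobile object.
    """
    m = 0                                       # lane change parameter (S402)
    f0, f1, f2 = capture_frame(), capture_frame(), capture_frame()
    previous_lane = lane_of(f0, f1, f2)         # S405-S407

    while m < m_max:                            # S414
        f0, f1, f2 = f1, f2, capture_frame()    # S408
        current_lane = lane_of(f0, f1, f2)      # S409-S411
        if current_lane != previous_lane:       # S412
            m += 1                              # S413
        previous_lane = current_lane

    frames = burst_capture(count=10, interval_s=0.1)  # S415
    send_to_central(frames)                           # S416
```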
Next, the lane-change initialization process is described below.
In Step S501, the driving lane area, which is an area occupied by a driving lane in an image photographed by the first imager 210, is determined. The driving lane area becomes noticeable when the calculator 220 generates a line to divide the driving lanes. The shoulder of the road is also recognized as a driving lane so that a vehicle can be detected when it runs onto the shoulder of the road.
In Step S502, a number is assigned to the selected driving lane area. In the present embodiment, as shown in Fig. 3, the shoulder on the right side of a driving lane with a vehicle photographed by near-infrared light is assigned lane number L1, the rightmost driving lane with a vehicle photographed by near-infrared light is assigned lane number L2, the driving lane located to the left of driving lane L2 is assigned lane number L3, and the leftmost driving lane without a vehicle is assigned lane number L4.
In Step S503, one frame is picked out of a moving image. A driving lane in which a mobile object does not exist is photographed in advance, so that a mobile object can be easily detected. Then, the process ends.
According to the lane-change recording process, a vehicle that changes lanes a number of times that is greater than or equal to the maximum lane change parameter mMAX is automatically photographed and recorded.
The roadside traffic recording process is described hereinafter with reference to Figs. 6 and 7. The roadside traffic recording process is executed by the calculator 220 when the first client 200 is ready to begin photographing an image.
In Step S601, a traffic initialization process is executed. The traffic initialization process assigns a lane number to the driving lane area, sets the maximum roadside passing parameter mMAX, picks one frame out of a moving image, and then recognizes the shoulder of a road. The details of the traffic initialization process are described hereinafter. In the present process, the maximum roadside passing parameter mMAX is the maximum number of times that a mobile object can pass onto the shoulder of a road; but more precisely, it is a maximum period and it is determined by a user. Frame No. 0 is the frame captured in the traffic initialization process.
In Step S602, a finding parameter "m" is initialized with the value of zero. The finding parameter m is the number of times that an observed mobile object is found on the shoulder of a road; more precisely, because frames are captured at a fixed interval, it measures a period of time.
In Step S603, the frame number parameter "n" is initialized with the value of one. The frame number parameter n is incrementally increased as it is assigned to each frame in a moving image, from the first frame to the last frame, and indicates a frame number.
In Step S604, frame numbers n+1 and n+2 are captured from a moving image. Frame number n+2 is captured a predetermined period after frame number n+1 is captured.
For example, the predetermined period is 100 milliseconds.
In Step S605, a difference "P" between frame number n+1 and frame number 0 or n, and a difference "Q" between frame number n+2 and frame number n+1 are measured.
Therefore, the value of a pixel in a certain frame that is different from the value of a pixel in a frame that follows a predetermined period after the certain frame is detected.
The pixel that has a different value corresponds to a mobile object, that is, an object that moves for a predetermined period from the moment that a certain frame is captured.
In Step S606, the differences P and Q are binarized with a predetermined threshold value. Then, a logical multiplication R is calculated with the binarized differences P and Q. According to binarization with a predetermined threshold value, a pixel that does not represent a mobile object is deemed to be noise and is removed. According to the logical multiplication R calculated with the binarized differences P and Q, a mobile object is specified if it is included in a certain frame and in a frame that follows a predetermined period afterward.
In Step S607, the frame number parameter "n" is incrementally increased by one.
In Step S608, a frame is captured from a moving image output by the first imager 210. After processing Step S608, three frames are captured. That is, two frames are captured in Step S604 and the other one is captured in S608. Hereinafter, the latest captured frame is frame number n+2, the frame captured immediately before frame number n+2 is frame number n+1, and the frame captured immediately before frame number n+1 is frame number n.
In Step S609, "S" is assigned the difference between frame number n+1 and frame number n, and "T" is assigned the difference between frame number n+2 and frame number n+1. Therefore, a mobile object is an object that moves for a predetermined period from the moment when a certain frame is captured until a frame is picked out of a moving image.
In Step S610, the differences S and T are binarized with a predetermined threshold value. Then, a logical multiplication U is calculated with the binarized differences S and T. According to binarization with a predetermined threshold value, a pixel that does not represent a mobile object is deemed to be noise and is removed. According to the logical multiplication U calculated with the binarized differences S and T, a mobile object is marked if it is included in a certain frame and in a frame that follows a predetermined period later.
In Step S611, the frame number parameter "n" is incrementally increased by one.
In Step S612, whether or not a mobile object is on the shoulder of a road is determined. The determination is made based on whether or not a mobile object marked in Step S610 exists in the driving lane area of a frame that corresponds to the shoulder of a road. In the case that a mobile object exists on the shoulder of a road, the process proceeds to Step S613, otherwise it returns to Step S608.
In Step S613, the finding parameter m is incrementally increased by one because a mobile object was detected on the shoulder of the road in Step S612.
Whether the finding parameter m is greater than or equal to a maximum roadside passing parameter mMAX is determined in Step S614. A frame is captured over a predetermined period so that the time period in which a mobile object is on the shoulder of a road can be determined by counting the number of frames in which the mobile object is photographed on the shoulder. In the case that the finding parameter m is larger than or equal to the maximum roadside passing parameter mMAX, the process proceeds to Step S615, otherwise it returns to Step S608.
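Because frames are captured at a fixed interval, the finding parameter converts directly to a dwell time. A trivial illustration assuming the 10 frame-per-second rate of the embodiment; the function name is hypothetical.

```python
FRAME_PERIOD_S = 0.1  # 10 frames per second, as in the embodiment

def shoulder_dwell_time(finding_parameter_m):
    """Time in seconds a mobile object has been seen on the shoulder,
    counted as frames in which it was marked there (Steps S612-S614)."""
    return finding_parameter_m * FRAME_PERIOD_S

# For example, a maximum roadside passing parameter mMAX of 50 frames
# would correspond to a dwell time of 5 seconds.
```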
Whether or not a mobile object is in a position to be photographed at a photographing point is determined in Step S615 based on a signal sent by the vehicle detector 230. The vehicle detector 230 comprises the loop coil and is provided at a photographing point. A mobile object is determined to be located at a photographing point when the calculator 220 receives a signal from the loop coil. When a mobile object is at a photographing point the process proceeds to Step S616, otherwise it returns to Step S608.
In Step S616, multiple frames are captured from a moving image output by the first imager 210. The frame capture interval and the number of frames to be captured are decided so that all of the lane changes of a mobile object will be recorded. In the present embodiment, ten frames are captured every 0.1 second.
In Step S617, all of the frames captured in Step S616 are sent to the central controller 300 by passing through the communicator 260. Then, the process ends.
Next, the traffic initialization process is described below.
A driving lane area, which is an area occupied by a driving lane in an image photographed by the first imager 210, is selected in Step S701. The calculator 220 recognizes yellow and white lines that divide the driving lanes with image processing. The shoulder of a road is recognized as one driving lane so that a vehicle can be detected when it is running on the shoulder of a road. A number is then assigned to the selected driving lane area.
In the present embodiment, lane number L1 is assigned to the shoulder of the road located at the right of the driving lane containing a vehicle photographed by near-infrared light, lane number L2 is assigned to the rightmost driving lane that contains a vehicle photographed in near-infrared light, lane number L3 is assigned to the driving lane located to the left of driving lane L2, and lane number L4 is assigned to the leftmost driving lane (see Fig. 3). In Step S702, the maximum roadside passing parameter mMAX is obtained.
In the next Step S703, one frame is selected from a moving image. An empty driving lane without a mobile object is photographed in advance so that a mobile object can be easily detected.
In Step S704, an area comprising the shoulder of the road is recognized as a roadside area. Then, the process ends.
According to the roadside traffic recording process, a vehicle that remains on the shoulder of the road for a period greater than or equal to the maximum roadside passing parameter mMAX is automatically photographed and recorded.
The vehicle color recording process is described hereinafter with reference to Figs. 8 and 9. The vehicle color recording process is executed by the calculator 220 when the first client 200 begins photographing an image.
In Step S801, a vehicle color initialization process is executed. The vehicle color initialization process establishes a vehicle color to be detected. The details of the vehicle color initialization process are described hereinafter. The vehicle color to be detected is set by a user.
In Step S802, a frame number parameter "n" is initialized with the value of 0. The frame number parameter n, which indicates the frame number, is incrementally increased by one from the first frame to the last frame as it is assigned to each frame of a moving image.
In Step S803, frames numbered n+1 and n+2 are captured from a moving image. The frame number n+2 is captured a predetermined period after the frame number n+1 is captured. For example, the predetermined period is 100 milliseconds.
In Step S804, "P" represents the difference between frame number n+1 and frame number n while "Q" is the difference between frame number n+2 and frame number n+1.
Therefore, the value of a pixel in a certain frame is detected if it is different from the value of a corresponding pixel in a frame that follows a predetermined period after the certain frame. The pixel with a different value corresponds to a mobile object that moves between the moment when it is captured in a certain frame and the moment when a frame is captured a predetermined period after the certain frame.
In Step S805, the differences P and Q are binarized with a predetermined threshold value. Then, a logical multiplication R is calculated with the binarized differences P and Q. According to binarization with a predetermined threshold value, a pixel that does not represent a mobile object is deemed to be noise and is removed. According to the logical multiplication R calculated with the binarized differences P and Q, a mobile object that is included in a certain frame and a frame that follows a predetermined period after the certain frame is specified.
In Step S806, frame number n+1 is added to the logical multiplication R, which was calculated based on the marked mobile object, in order to calculate a color recognition image C that contains color information.
In Step S807, it is determined whether or not the color of a mobile object corresponds to a target color.
The target color is the color of a sought-after target vehicle. In the case that the color of a mobile object corresponds to the target color, the process proceeds to Step S808, otherwise the process returns to Step S802.
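Steps S805-S807 gate the color frame with the marked-object mask and compare the object's color against the target. A minimal sketch building on the mark_mobile_object example above; the HSV tolerance values are placeholders, not figures from the patent.

```python
import colorsys
import numpy as np

def matches_target_color(color_frame, mask, target_hsv,
                         tol=(0.05, 0.2, 0.2)):
    """Does the marked mobile object match the target color (S806-S807)?

    color_frame: color frame n+1 as an H x W x 3 RGB array of floats
    in [0, 1]; mask: boolean array R from the marking operation.
    """
    if not mask.any():
        return False  # no mobile object was marked
    # Color recognition image C: frame n+1 restricted to marked pixels.
    mean_rgb = color_frame[mask].mean(axis=0)
    h, s, v = colorsys.rgb_to_hsv(*mean_rgb)
    # Compare channel-wise (ignoring hue wrap-around for brevity).
    return all(abs(a - b) <= t
               for a, b, t in zip((h, s, v), target_hsv, tol))
```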
In Step S808, a frame is captured from a moving image output by the first imager 210. The frame captured in Step S808 is a color image. Therefore, a color image is stored in the recorder 250, so that the color of a mobile object can be recognized.
In Step S809, the frame number parameter "n" is incrementally increased by one.
In Step S810, a frame is captured from a moving image output by the first imager 210. After processing Step S810, four frames are captured. That is, three frames are captured in Steps S803 and S808, and the fourth frame is captured in S810. The process described below is carried out with the three most recently captured frames.
Hereinafter, the latest captured frame is frame number n+2, the frame captured immediately before frame number n+2 is frame number n+1, and the frame captured immediately before the frame number n+1 is frame number n.
In Step S811, "S" represents the difference between frame number n+1 and frame number n, and "T" is the difference between frame number n+2 and frame number n+1.
Therefore, a mobile object is selected if it moves from the moment when a certain frame is captured to the moment when a different frame is captured a predetermined period after the certain frame.
In Step S812, the differences S and T are binarized with a predetermined threshold value. A logical multiplication U is then calculated with the binarized differences S and T. According to binarization with a predetermined threshold value, a pixel that does not represent a mobile object is deemed to be noise and is removed. According to the logical multiplication U calculated with the binarized differences S and T, a mobile object is marked if it is included in a certain frame and in a frame that follows a predetermined period after the certain frame.
Whether or not a mobile object is located at a photographing point is determined in Step S813 based on a signal sent by the vehicle detector 230. The vehicle detector 230, which comprises a loop coil, is positioned at a photographing point. A mobile object is determined to be located at a photographing point when the calculator 220 receives a signal from the loop coil. When a mobile object is detected at a photographing point the process proceeds to Step S814, otherwise it returns to Step S809.
In Step S814, multiple frames are captured from a moving image output by the first imager 210. The frame capture interval and the number of frames to be captured are decided so that all of the lane changes of a mobile object will be recorded. In the present embodiment, ten frames are captured every 0.1 second.
In Step S815, all of the frames that were captured in Step S814 are sent to the central controller 300 by passing through the communicator 260. Then, the process ends.
Next, the vehicle color initialization process is described below.
In Step S901, a user sets a vehicle color to be detected. The vehicle color is set in a color space such as HSV (hue, saturation (chroma), and value (brightness or lightness)), or RGB (red, green, and blue), etc. After the color is set by a user, the process ends.
According to the vehicle color recording process, a vehicle having a targeted vehicle color is automatically photographed and recorded.
The vehicle family recording process is described hereinafter with reference to Figs. 10 and 11. The vehicle family recording process is executed by the calculator 220 when the first client 200 begins photographing an image.
In Step S1001, a vehicle family initialization process is executed. The vehicle family initialization process sets a vehicle family and a maximum vehicle family parameter mMAX, and picks out one frame from a moving image. The details of the vehicle family initialization process are described hereinafter. The maximum vehicle family parameter mMAX is a number of frames used to capture a mobile object corresponding to a desired vehicle family (a target vehicle family), and is determined by a user.
Frame No. 0 is a frame captured in the vehicle family initialization process.
In Step S1002, a finding parameter "m" is initialized with the value of zero. The finding parameter m indicates the number of times that a mobile object corresponding to a target vehicle family is photographed.
In Step S1003, a frame number parameter "n" is initialized with the value of one. The frame number parameter n, which indicates the frame number, is incrementally increased by one from the first frame to the last frame as it is assigned to each frame in a moving image.
In Step S1004, frames numbered n+1 and n+2 are captured from a moving image. Frame number n+2 is captured a predetermined period after frame number n+1 is captured.
For example, the predetermined period is 100 milliseconds.
In Step S1005, "P" represents a difference between frame number n+1 and frame number n, and "Q" represents a difference between frame number n+2 and frame number n+1.
Therefore, the value of a pixel in a certain frame is picked up if it is different from the value of a corresponding pixel in a frame that follows a predetermined period after the certain frame. The pixel that has a different value corresponds to a mobile object, that is, an object that moves during the time between the capture of a certain frame and the capture of another frame a predetermined period later.
In Step S1006, the differences P and Q are binarized with a predetermined threshold value. Then, a logical multiplication R is calculated with the binarized differences P and Q. According to binarization with a predetermined threshold value, a pixel that does not represent a mobile object is deemed to be noise and is removed. According to the logical multiplication R calculated with the binarized differences P and Q, a mobile object is specified if it is detected in a certain frame and in a frame a predetermined period after the certain frame.
In Step S1007, the frame number parameter "n" is incrementally increased by one.
In Step S1008, a frame is captured from a moving image output by the first imager 210. After processing Step S1008, three frames are captured. That is, two frames are captured in Step S1004 and the third frame is captured in S1008. Hereinafter, the latest captured frame is frame number n+2, the frame captured immediately before frame number n+2 is frame number n+1, and the frame captured immediately before frame number n+1 is frame number n.
In Step S1009, the frame number parameter "n" is incrementally increased by one.
In Step S1010, "S" represents the difference between frame number n+1 and frame number n, and "T" represents the difference between frame number n+2 and frame number n+1.
Therefore, a mobile object is detected if it moves during the time between the capture of a certain frame and the capture of another frame a predetermined period later.
In Step S1011, the differences S and T are binarized with a predetermined threshold value. Then, a logical multiplication U is calculated with the binarized differences S and T. According to binarization with a predetermined threshold value, a pixel that does not represent a mobile object is deemed to be signal noise and is removed. According to the logical multiplication U calculated with the binarized differences S and T, a mobile object is marked if it is detected in a certain frame and in a frame that comes a predetermined period later.
In Step S1012, whether or not a mobile object corresponds to a target vehicle family is determined. The determination of whether or not the mobile object marked in Step S1010 corresponds to a target vehicle family is based on an image from one frame. In the case that a mobile object corresponds to a target vehicle family, the process continues to Step S1013, otherwise it proceeds to Step S1014.
In Step S1013, the finding parameter m is incrementally increased by one, because the mobile object was determined to correspond to a target vehicle family in Step S1012.
Whether or not the finding parameter m is greater than or equal to the maximum vehicle family parameter mMAX is determined in Step S1014. After counting the number of frames in which a mobile object is captured, the mobile object is determined to correspond to a target vehicle family if the counted number of frames is greater than or equal to the maximum vehicle family parameter mMAX. This process increases the accuracy of detecting a target vehicle family. In the case that the finding parameter m is greater than or equal to the maximum vehicle family parameter mMAX, the process proceeds to Step S1015, otherwise it returns to Step S1008.
Whether or not a mobile object is at the location of a photographing point is determined in Step S1015 based on a signal sent by the vehicle detector 230. The vehicle detector 230 comprises the loop coil and is provided at a photographing point. A mobile object is determined to be at the location of a photographing point when the calculator 220 receives a signal from the loop coil. In the case that a mobile object is located at a photographing point, the process proceeds to Step S1016, otherwise it returns to Step S1008.
In Step S1016, multiple frames are captured from a moving image output by the first imager 210. The frame capture interval and the number of frames to be captured are decided so that all of the lane changes of a mobile object will be recorded. In the present embodiment, ten frames are captured every 0.1 second.
In Step S1017, all of the frames captured in Step S1016 are sent to the central controller 300 by passing through the communicator 260. Then, the process ends.
Next, the vehicle family initialization process is described below.
In Step S1101, a user sets a vehicle family to be detected.
In Step S1102, the maximum vehicle family parameter mMAX is obtained.
In Step S1103, one frame is selected from a moving image. A driving lane that does not contain a mobile object is photographed in advance so that a mobile object can be easily detected. Then, the process ends.
According to the vehicle family recording process, a vehicle corresponding to a desired vehicle family is automatically photographed and recorded.
The present embodiment monitors the movement of a vehicle and creates a high-resolution image by capturing a moving vehicle.
Note that the present embodiment equally divides the imaging area into the near-infrared imaging area 218 and the visible light imaging area 219 by generating a line that bisects the optical axis of the photographing lens 211; however, one of the imaging areas may be enlarged at the expense of the other imaging area based on a photographing condition.
The second embodiment of the invention, shown in Figs. 12 and 13, is described below. The constructions of the second embodiment that are similar to the first embodiment are indicated by the same numbers and their
descriptions have been omitted.
The second imaging system 400 comprises a second client 500 that is provided at a position for photographing an object, and a central controller 300 that is provided separately from the second client 500.
The second client 500 comprises a second imager 510, a calculator 220, a vehicle detector 230, a near-infrared light source 240, a recorder 250, and a communicator 260.
The second imager 510 comprises an imaging lens 211 that is an imaging optical system, a second CCD 512 and a third CCD 515 that are imaging sensors, a second visible-light blocking filter 516 that is an invisible light filter, a second polarizer 517, a second infrared blocking filter 514 that is a color filter, and a half mirror 513.
The second CCD 512 and the third CCD 515 each have more than one million pixels.
A second imaging surface 519 and a third imaging surface 520, on which an optical image passing through the imaging lens 211 is captured, are rectangular and provided on the second CCD 512 and the third CCD 515, respectively.
The second CCD 512 is configured in the second client 500 so that the direction of the optical axis of the photographing lens 211 is parallel to the direction of the short side of the second imaging surface 519, and the force of gravity acts on the second client 500 in a direction that is orthogonal to the second imaging surface 519. The third CCD 515 is configured in the second client 500 so that the direction of the optical axis of the photographing lens 211 is orthogonal to the third imaging surface 520, and the force of gravity acts on the second client 500 in the direction parallel to the short side of the third imaging surface 520. The second imaging surface 519 is the visible light imaging area, and the third imaging surface 520 is the near-infrared imaging area. The second and third CCDs 512 and 515 simultaneously capture an optical image that is incident on the second and third imaging areas 519 and 520, which are the visible imaging area 519 and the near-infrared imaging area 520. The second and third CCDs 512 and 515 output a ten frame-per-second moving image to the calculator 220 and the recorder 250.
The second polarizer 517 is an optical filter that transmits only light of a predetermined polarization state, and is attached to the third CCD 515 so that it covers the whole surface of the near-infrared imaging area 520.
The second visible-light blocking filter 516 is an optical filter that transmits only near-infrared light and not visible light. The second visible-light blocking filter 516 is provided on the imaging lens 211 side of the second polarizer 517 so that it covers the whole surface of the second polarizer 517.
The second infrared blocking filter 514 is an optical filter that does not transmit infrared light but does transmit other light, and is provided near the second CCD 512 so that it covers the whole surface of the visible light imaging area 519. The thickness and refractive index of the second infrared blocking filter 514 are configured so that the light path length of the second infrared blocking filter 514 is equal to the light path length obtained by adding the light path length of the second visible-light blocking filter 516 to the light path length of the second polarizer 517.
The half mirror 513 is provided on the optical path between the photographing lens 211 and the second and third CCDs 512 and 515, so as to form a 45 degree angle with the optical axis of the photographing lens 211. The half mirror 513 reflects half of the incoming light and transmits half. Light from a distant object that passes through the photographing lens 211 is reflected by the half mirror 513 onto the visible imaging area 519. Light from a close object is transmitted through the half mirror 513 onto the near-infrared imaging area 520. The visible imaging area 519 and the near-infrared imaging area 520 are orthogonal to the optical path.
According to these constructions, the second CCD 512 captures a color image of a distant object and the third CCD 515 captures a near-infrared image of a close object (see Fig. 3). According to the present embodiment, a small and inexpensive imaging sensor is used. Additionally, multiple filters need not be provided on one imaging sensor.
The third embodiment of the invention, shown in Figs. 14 and 15, is described below. The constructions of the third embodiment that are similar to the first embodiment are indicated by the same numbers and their
descriptions have been omitted.
The third imaging system 900 comprises a third client 600 that is provided at the photographing position of an object, and a central controller 300 that is provided separately from the third client 600.
The third client 600 comprises a third imager 610, a calculator 220, a vehicle detector 230, a near-infrared light source 240, a recorder 250, and a communicator 260.
The third imager 610 comprises an imaging lens 211 that is an imaging optical system, a fourth CCD 612 and a fifth CCD 615 that are imaging sensors, a second polarizer 517, and a beam splitter 813 that is a prism used for dividing an optical path.
The fourth CCD 612 and the fifth CCD 615 are provided so that their light path lengths from the imaging lens 211 and the coated surface 818 are equal to each other.
The fourth CCD 612 and the fifth CCD 615 each have more than one million pixels. A fourth imaging surface 619 and a fifth imaging surface 620, upon which an optical image passing through the imaging lens 211 is captured, are rectangular and are provided on the fourth CCD 612 and the fifth CCD 615, respectively. The fourth imaging surface 619 and the fifth imaging surface 620 have the same aspect ratio, area, and number of pixels.
The fourth CCD 612 is provided in the third client 600 so that the direction of the optical axis of the photographing lens 211 is parallel to the short side of the fourth imaging surface 619, and the direction of the force of gravity acting on the third client 600 is orthogonal to the fourth imaging surface 619 and the fifth imaging surface 620. The fifth CCD 615 is configured in the third client 600 such that the direction of the optical axis of the photographing lens 211 is orthogonal to the fifth imaging surface 620, and the force of gravity acts on the third client 600 in the direction that is parallel to the short side of the fifth imaging surface 620. The fourth imaging surface 619 is the visible light imaging area, and the fifth imaging surface 620 is the near-infrared imaging area.
The fourth and fifth CCDs 612 and 615 simultaneously capture an optical image that is incident on the fourth and fifth imaging areas 619 and 620, which are the visible imaging area 619 and the near-infrared imaging area 620, respectively. The fourth and fifth CCDs 612 and 615 output a 10 frame-per-second moving image to the calculator 220 and the recorder 250.
The second polarizer 517 is an optical filter that transmits only light of a predetermined polarization state, and is attached to the fifth CCD 615 so that it covers the whole surface of the near-infrared imaging area 620.
The beam splitter 813 is a quadrangular prism that has a four-sided base surface and is made by bonding together right-angle prisms 813a and 813b. The right-angle prisms 813a and 813b have the same triangular-prism shape, and comprise top and bottom surfaces that are right-angle triangles and lateral surfaces that are quadrangles. The lateral surface that does not form a right angle with the top and bottom surfaces is the rectangular bonding surface. The bonding surface of one of the right-angle prisms 813a or 813b has a rectangular coated surface 818 that reflects visible light and transmits near-infrared light. That is, the bonding surfaces of the right-angle prisms 813a and 813b are bonded together with the coated surface 818 between them. The beam splitter 813 is positioned on the light path so that the coated surface 818 faces the photographing lens 211, and the light paths of the imaging surfaces of the fourth and fifth CCDs 612 and 615 are incident to the coated surface 818.
Light incident from the photographing lens 211 onto the beam splitter 813 is divided into visible light and near-infrared light. The visible light is reflected by the coated surface 818 onto the fourth CCD 612. The near-infrared light is transmitted through the coated surface 818 onto the fifth CCD 615. A visible-light blocking filter and an infrared blocking filter may not need to be provided, because the coated surface 818 divides the visible light and the near-infrared light.
According to these constructions, the fourth CCD 612 captures a color image of a distant object, and the fifth CCD 615 captures a near-infrared image of a close object (see Fig. 3). For example, the third imager 610 is provided above a driving lane of a highway to monitor a moving vehicle in the driving lane and to photograph a vehicle that approaches from a long distance. The vertical direction of an optical image incident on the fourth and fifth imaging surfaces 619 and 620 is inverted relative to the vertical direction of the object entering the photographing lens 211.
Therefore, when a mobile object, for example a vehicle approaching the third client 600, is detected from a comparatively long distance, an optical image of the approaching vehicle is formed on the visible light imaging area 619. As the vehicle approaches the third client 600 and reaches a comparatively close distance, an optical image of the approaching vehicle is formed on the near-infrared imaging area 620. Thereby, the third client 600 outputs a color image of an object located a comparatively long distance away, and outputs an infrared image of an object located comparatively close (see Fig. 3).
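This far/near switch can be pictured with a small sketch. Note that it is not from the patent: the split row, the names, and the idea of indexing by centroid row are illustrative assumptions, since the patent gives no pixel coordinates.

```python
# Minimal sketch, assuming the combined field of view maps distant vehicles to
# the visible light imaging area 619 and close vehicles to the near-infrared
# imaging area 620.  AREA_SPLIT_ROW and all names are hypothetical.

AREA_SPLIT_ROW = 540  # assumed boundary row between the two imaging areas

def select_output(centroid_row: int) -> str:
    """Pick the image stream for a vehicle whose optical image is centered
    at the given row of the combined frame (0 = top)."""
    if centroid_row < AREA_SPLIT_ROW:
        # distant vehicle: its optical image falls on the visible light area 619
        return "color image from the fourth CCD 612"
    # close vehicle: its optical image falls on the near-infrared area 620
    return "near-infrared image from the fifth CCD 615"

print(select_output(200))  # far away -> color image
print(select_output(800))  # close    -> near-infrared image
```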
The vehicle detector 230 is a loop coil that is buried under the road surface within the angle of view of the third imager 610 and that detects a passing vehicle.
The calculator 220 processes moving images that it receives from the fourth and fifth CCDs 612 and 615 after receiving a vehicle detection signal from the vehicle detector 230. The calculator 220 executes the lane-change recording process, the roadside traffic recording process, the vehicle color recording process, and the vehicle family recording process during image processing. These processes are described in the first embodiment, and their descriptions are therefore omitted here. In these recording processes, the calculator 220 captures images of a driver and a car registration plate that are formed on the near-infrared imaging area 620, creates a still image, and sends the still image to the recorder 250.
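A sketch of this trigger-and-record flow follows. The class, the method names, and the three-second buffer length are hypothetical; the patent describes the behavior but names no software interface.

```python
# Hypothetical sketch of the calculator 220's recording flow: buffer frames
# continuously, and on a vehicle detection signal cut a still image from the
# buffered near-infrared frames and hand it to the recorder.
from collections import deque

FPS = 10               # frame rate stated in the description
PRE_EVENT_SECONDS = 3  # assumed length of the buffered "certain period"

class Calculator:
    """Stand-in for the calculator 220; `recorder` is any object with store()."""
    def __init__(self, recorder):
        self.recorder = recorder
        self.buffer = deque(maxlen=FPS * PRE_EVENT_SECONDS)  # ring buffer

    def on_frame(self, nir_frame):
        self.buffer.append(nir_frame)  # oldest frame is dropped automatically

    def on_vehicle_detected(self):
        # create a still image (driver / registration plate) from the most
        # recent near-infrared frame and send it to the recorder
        if self.buffer:
            self.recorder.store(self.buffer[-1])
```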
According to the present embodiment, a small and inexpensive imaging sensor is used to monitor the movement of a vehicle and capture an optical image of the moving vehicle in order to create a high-resolution image.
Additionally, multiple filters need not be provided on one imaging sensor. A half mirror may degrade an optical image because the back side of its light-receiving surface can reflect light. The beam splitter 813, however, prevents such deterioration of the optical image and saves the cost of providing a visible-light blocking filter and an infrared blocking filter.
Note that a light path length-adjustment filter may be provided between the fourth CCD 612 and the beam splitter 813. The light path length-adjustment filter adjusts the light path length to the fourth CCD 612 so that the light path length from the coated surface 818 or the photographing lens 211 to the fourth CCD 612 is the same as the light path length from the coated surface 818 or the photographing lens 211 to the fifth CCD 615. This construction saves the cost of adjusting the light path lengths.
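As a worked illustration from standard optics (not stated in the patent): a plane-parallel plate of thickness $t$ and refractive index $n$ shifts the image plane away from the lens by

$$\Delta = t\left(1 - \frac{1}{n}\right),$$

so if the geometric path to the fourth CCD 612 were longer than that to the fifth CCD 615 by $\Delta$, a filter of thickness $t = n\Delta/(n-1)$ placed in that path would bring the two image planes into coincidence. For example, glass with $n = 1.5$ compensating $\Delta = 2\ \mathrm{mm}$ would require $t = 6\ \mathrm{mm}$.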
Note that in all embodiments the imaging sensor is not limited to a CCD and may be, for example, a CMOS sensor. The number of pixels of a CCD is not limited to more than one million, and the photographing frequency is not limited to ten frames per second.
The near-infrared imaging area 520 and the visible light imaging area 519 need not have the same aspect ratio, area, and number of pixels; they may have different aspect ratios, areas, or numbers of pixels.
The loop coil is used for the detection of a vehicle in the present embodiment; however, the following process may be used as a substitute for the loop coil. The location of a vehicle is detected by calculating the velocity and direction of its optical image on the CCD 212, and the timing of its approach to the area of the near-infrared imaging area corresponding to a photographing point is calculated from that velocity and direction, so that the mobile object is photographed by near-infrared light.
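A minimal sketch of this loop-coil substitute follows; the function name and the target row are hypothetical, but the velocity-and-timing calculation mirrors the process described above.

```python
# Hypothetical sketch: predict the frame at which a vehicle's optical image
# reaches the near-infrared photographing point from its velocity on the
# sensor, replacing the loop coil with an image-based trigger.

def predict_capture_frame(centroids, target_row):
    """Estimate the frame index at which the image centroid reaches target_row.

    centroids  -- list of (frame_index, row) positions of the vehicle's image
    target_row -- sensor row of the photographing point (assumed known)
    """
    (f0, r0), (f1, r1) = centroids[-2], centroids[-1]
    rows_per_frame = (r1 - r0) / (f1 - f0)  # image velocity on the sensor
    if rows_per_frame <= 0:
        return None                          # image is not approaching the point
    frames_remaining = (target_row - r1) / rows_per_frame
    return f1 + frames_remaining             # may be fractional

# Example: the centroid moves from row 300 (frame 10) to row 360 (frame 12);
# the photographing point sits at row 900 -> reached around frame 30.
print(predict_capture_frame([(10, 300), (12, 360)], target_row=900))
```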
The vehicle detector 230 need not be a loop coil; it may instead be any sensor that can detect the location of a vehicle, for example an infrared sensor or an acoustic wave sensor.
The imaging system can monitor fruit in place of a vehicle. For example, it can select fruit based on sugar content instead of selecting a vehicle based on its color.
Additionally, it can sort fruit by kind or size in place of sorting vehicles by vehicle family.
Moreover, the imaging system can monitor a parcel in place of a vehicle. It can sort a parcel according to its destination, size, and shape.
Although the embodiment of the present invention has been described herein with reference to the accompanying drawings, obviously many modifications and changes may be made by those skilled in the art without departing from the scope of the invention.
Claims (31)
CLAIMS
- 1. An imaging system comprising: an imaging sensor that has a first imaging area and a second imaging area that is different from the first imaging area, the imaging sensor capturing an optical image that moves on the first imaging area and the second imaging area; a color filter that is provided on the first imaging area; an invisible light filter that is provided on the second imaging area; a memory for storing an image captured by said imaging sensor; and a controller for retrieving an image from said memory and storing the retrieved image in a storage medium, and determining whether or not the optical image corresponds to a predetermined condition based on an image that was captured on either the first imaging area or the second imaging area; said imaging sensor capturing an optical image, which passes through said color filter, and outputting a color image; and capturing an optical image, which passes through said invisible light filter, and outputting an invisible light image; in the case that said controller determines that the optical image corresponds to a predetermined condition, said controller retrieving from said memory an image captured a certain period before the optical image corresponding to the predetermined condition, and storing the retrieved image in the storage medium.
- 2. An imaging system according to claim 1 further comprising a detector that detects whether or not an optical image moves from either the first imaging area or the second imaging area to the respective other area, and wherein in the case that said controller determines that the optical image corresponds to a predetermined condition and said detector detects that an optical image moves from one area to the other area, said controller retrieves from said memory an image captured a certain period of time before the optical image corresponding to the predetermined condition moves to the other area, and stores the retrieved image in the storage medium.
- 3. An imaging system according to claim 1, wherein said imaging sensor captures an optical image that moves from the first imaging area to the second imaging area, and said controller determines whether or not the optical image corresponds to a predetermined condition based on the color image, and in the case that the optical image corresponds to a predetermined condition said controller stores the invisible light image in the storage medium.
- 4. An imaging system according to claim 3, wherein said controller stores the invisible light image and the color image in the storage medium in the case that the optical image corresponds to a predetermined condition.
- 5. An imaging system according to claim 1, wherein the controller virtually divides an angle of view of said imaging sensor into multiple view areas, and in the case that the optical image repeatedly moves from one view area to another view area, said controller determines that the optical image corresponds to the predetermined condition.
- 6. An imaging system according to claim 1, wherein in the case that the optical image passes through a predetermined point in the angle of view of said imaging sensor, said controller determines that the optical image corresponds to the predetermined condition.
- 7. An imaging system according to claim 1, wherein in the case that the optical image has a predetermined color, said controller determines that the optical image corresponds to the predetermined condition.
- 8. An imaging system according to claim 1, wherein in the case that the optical image has a predetermined shape, said controller determines that the optical image corresponds to the predetermined condition.
- 9. An imaging system according to claim 1, wherein the first imaging area and the second imaging area are provided on one imaging sensor.
- 10. An imaging system according to claim 1, wherein said imaging sensor comprises a first imaging sensor and a second imaging sensor, the first imaging area is provided on the first imaging sensor and the second imaging area is provided on the second imaging sensor.
- 11. A vehicle monitoring system comprising: an imaging sensor that has a fixed angle of view and is positioned near a road, and that has a first imaging area and a second imaging area that is different from the first imaging area, the imaging sensor capturing an optical image that moves in one direction on the first imaging area and the second imaging area; a color filter that is provided on the first imaging area; an invisible light filter that is provided on the second imaging area; a memory for storing an image captured by said imaging sensor; and a controller for retrieving an image from said memory and storing the retrieved image in a storage medium, and for determining whether or not the optical image of a vehicle passes through a predetermined area in the angle of view of said imaging sensor based on a captured image that was captured on either the first imaging area or the second imaging area; said imaging sensor capturing an optical image, which passes through said color filter, and outputting a color image; and capturing an optical image, which passes through said invisible light filter, and outputting an invisible light image; in the case that said controller determines that the optical image of a vehicle passes through a predetermined area in the angle of view of said imaging sensor, said controller retrieving an image from said memory that was captured in a certain period before the optical image corresponding to a predetermined condition, and storing the retrieved image in the storage medium.
- 12. A vehicle monitoring system comprising: an imaging sensor with a fixed angle of view that is positioned near a road, and that has a first imaging area and a second imaging area that is different from the first imaging area, the imaging sensor capturing an optical image that moves in one direction on the first imaging area and the second imaging area; a color filter that is provided on the first imaging area; an invisible light filter that is provided on the second imaging area; a memory for storing an image captured by said imaging sensor; and a controller for retrieving an image from said memory, for storing the retrieved image in a storage medium, for virtually dividing an angle of view of said imaging sensor into multiple areas, and for determining whether or not the optical image of a vehicle moves from one view area to another view area based on an image that was captured on either the first imaging area or the second imaging area; said imaging sensor capturing an optical image, which passes through said color filter, and outputting a color image; said imaging sensor capturing an optical image, which passes through said invisible light filter, and outputting an invisible light image; in the case that said controller determines that the optical image of a vehicle repeatedly moves from one view area to another view area, said controller retrieving from said memory an image that was captured in a certain period before the optical image corresponding to a predetermined condition was captured, and storing it in the storage medium.
- 13. An imager comprising: an imaging sensor that has a first imaging area and a second imaging area that is different from the first imaging area, the imaging sensor capturing an optical image that moves in one direction on the first imaging area and the second imaging area; a color filter that is provided on the first imaging area; and an invisible light filter that is provided on the second imaging area.
- 14. An imager according to claim 13, wherein said imaging sensor captures an optical image that moves from the first imaging area to the second imaging area.
- 15. An imager according to claim 13, wherein said imaging sensor is rectangle-shaped and configured in the imager so that the longitudinal direction of said imaging sensor is parallel to the direction of gravity, said invisible light filter is provided on a light path leading to the upper half of said imaging sensor in the direction of gravity, said color filter is provided on a light path leading to the lower half of said imaging sensor in the direction of gravity, the upper half of said imaging sensor is for converting an optical image to an invisible light image, and the lower half of said imaging sensor is for converting an optical image to a color image.
- 16. An imager according to claim 13, further comprising a half mirror provided on a light path leading from an object to said imaging sensor, and wherein said imaging sensor has a color imaging sensor, which is the first imaging area, and an invisible light imaging sensor, which is the second imaging area, said half mirror is for reflecting the far side of an object image toward the color imaging sensor and for transmitting the near side of an object image toward the invisible light imaging sensor.
- 17. An imager according to claim 16 further comprising a photographing lens for directing an object image toward said imaging sensor, wherein the color imaging sensor is provided in the imager so that its imaging surface is parallel to the optical axis of the photographing lens, and the invisible light imaging sensor is provided in the imager so that its imaging surface is orthogonal to the optical axis of the photographing lens.
- 18. An imager according to claim 13, wherein the color filter does not transmit infrared light.
- 19. An imager according to claim 13, wherein the invisible light filter does not transmit visible light.
- 20. An imager according to claim 13 further comprising a polarizer that is provided on the light path that passes through said invisible light filter.
- 21. An imager according to claim 20, wherein the light path length of said color filter is the same as the light path length of said invisible light filter and said polarizer.
- 22. An imager according to claim 20, further comprising a light path length-adjustment filter that is provided in the light path that passes through said color filter, and wherein the light path length of said light path length-adjustment filter is the same as the light path length of said polarizer.
- 23. An imager according to claim 22, wherein said light path length-adjustment filter is provided between said color filter and said imaging sensor.
- 24. An imager according to claim 13, further comprising a photographing lens that is a pan-focus lens for directing an object image toward said imaging sensor.
- 25. An imager comprising: a color imaging sensor for capturing visible light; an invisible light imaging sensor for capturing invisible light; and a beam splitter that is provided on a light path from an object to said color imaging sensor and said invisible light imaging sensor, and that has a coated surface for reflecting visible light and for transmitting invisible light; said color imaging sensor and said invisible light imaging sensor capturing an optical image moving from said color imaging sensor to said invisible light imaging sensor that corresponds to an object that moves from a far side to a near side.
- 26. An imager according to claim 25, wherein said beam splitter is formed by attached prisms, and the coated surface is provided on either one of the attached surfaces of the prisms.
- 27. An imager according to claim 25, wherein said color imaging sensor is provided in the imager so that its imaging surface is parallel to the optical path, and said invisible light imaging sensor is provided in the imager so that its imaging surface is orthogonal to the optical path.
- 28. An imager according to claim 25, further comprising a pan-focus photographing lens for directing an object image toward said imaging sensor.
- 29. An imager according to claim 25, further comprising a polarizer that is provided on the light path that leads from said beam splitter to said invisible light imaging sensor.
- 30. An imager according to claim 25, further comprising a light path length-adjustment filter that is provided on the light path leading from said beam splitter to said color imaging sensor, and wherein said light path length-adjustment filter is for adjusting the light path length from the coated surface to said color imaging sensor and the light path length from the coated surface to said invisible light imaging sensor so that the lengths are equal.
- 31. An imager according to claim 25, wherein the invisible light is near-infrared light.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009234340A JP2011082856A (en) | 2009-10-08 | 2009-10-08 | Imaging apparatus |
JP2009234339A JP2011082855A (en) | 2009-10-08 | 2009-10-08 | Imaging apparatus |
JP2009234303A JP2011081657A (en) | 2009-10-08 | 2009-10-08 | Imaging system |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201016975D0 GB201016975D0 (en) | 2010-11-24 |
GB2474557A true GB2474557A (en) | 2011-04-20 |
Family
ID=43304260
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1016975A Withdrawn GB2474557A (en) | 2009-10-08 | 2010-10-08 | Vehicle movement detection using visible and invisible light |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN102088553A (en) |
DE (1) | DE102010038056A1 (en) |
GB (1) | GB2474557A (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102014206677A1 (en) * | 2014-04-07 | 2015-10-08 | Robert Bosch Gmbh | Camera system and method for detecting an environment of a vehicle |
CN104159084A (en) * | 2014-08-21 | 2014-11-19 | 中南林业科技大学 | Monitoring method based on novel monitoring lens and dual image sensors |
JPWO2017217053A1 (en) * | 2016-06-17 | 2019-01-31 | シャープ株式会社 | Image pickup apparatus and filter |
EP3502775A1 (en) * | 2017-12-19 | 2019-06-26 | DURA Operating, LLC | A visual recognition system of a vehicle, and a method for implementing such system |
CN110189540A (en) * | 2019-05-19 | 2019-08-30 | 复旦大学 | Itellectualized uptown Outdoor Parking total management system |
CN113053138A (en) * | 2020-11-13 | 2021-06-29 | 泰州无印广告传媒有限公司 | Annular LED display array driving system and corresponding terminal |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2633090B1 (en) | 1988-06-16 | 1992-07-31 | Cogema | PROCESS FOR SEPARATING USING CROWN ETHERS URANIUM AND PLUTONIUM FROM AN AQUEOUS MEDIUM FROM THE PROCESSING OF IRRADIATED NUCLEAR FUELS |
JP2001257923A (en) | 2000-03-14 | 2001-09-21 | Denso Corp | Image pickup device |
JP2009234339A (en) | 2008-03-26 | 2009-10-15 | Stanley Electric Co Ltd | Turn signal lamp for vehicles, and vehicular turn signal device |
JP5013208B2 (en) | 2008-03-26 | 2012-08-29 | 株式会社デンソー | Air conditioner control method and air conditioner control apparatus |
JP4949301B2 (en) | 2008-03-26 | 2012-06-06 | カルソニックカンセイ株式会社 | Radiator core support |
2010
- 2010-10-08 DE DE102010038056A patent/DE102010038056A1/en not_active Withdrawn
- 2010-10-08 GB GB1016975A patent/GB2474557A/en not_active Withdrawn
- 2010-10-08 CN CN2010105371921A patent/CN102088553A/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050207487A1 (en) * | 2000-06-14 | 2005-09-22 | Monroe David A | Digital security multimedia sensor |
EP1723464A1 (en) * | 2004-03-10 | 2006-11-22 | Raytheon Company | Dual-band sensor system utilizing a wavelength-selective beamsplitter |
US20060177129A1 (en) * | 2005-02-07 | 2006-08-10 | Sanyo Electric Co., Ltd. | Color signal processing method |
JP2007150826A (en) * | 2005-11-29 | 2007-06-14 | Alpine Electronics Inc | Imaging apparatus and vehicle peripheral image providing apparatus |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2721828A1 (en) * | 2011-06-15 | 2014-04-23 | Microsoft Corporation | High resolution multispectral image capture |
EP2721828A4 (en) * | 2011-06-15 | 2014-05-14 | Microsoft Corp | High resolution multispectral image capture |
US9635274B2 (en) | 2011-06-15 | 2017-04-25 | Microsoft Technology Licensing, Llc | High resolution multispectral image capture |
US9992457B2 (en) | 2011-06-15 | 2018-06-05 | Microsoft Technology Licensing, Llc | High resolution multispectral image capture |
US9781361B2 (en) | 2015-09-01 | 2017-10-03 | Delphi Technologies, Inc. | Integrated camera, ambient light detection, and rain sensor assembly |
Also Published As
Publication number | Publication date |
---|---|
GB201016975D0 (en) | 2010-11-24 |
DE102010038056A1 (en) | 2011-05-26 |
DE102010038056A8 (en) | 2011-11-10 |
CN102088553A (en) | 2011-06-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2474557A (en) | Vehicle movement detection using visible and invisible light | |
US6977674B2 (en) | Stereo-image capturing device | |
US7646550B2 (en) | Three-channel camera systems with collinear apertures | |
US7372642B2 (en) | Three-channel camera systems with non-collinear apertures | |
US7819591B2 (en) | Monocular three-dimensional imaging | |
EP1984785B1 (en) | Monocular three-dimensional imaging | |
JP3726699B2 (en) | Optical imaging device, optical distance measuring device | |
JP5827988B2 (en) | Stereo imaging device | |
CN106796293A (en) | Linear model calculates sensing LADAR | |
JP2017106897A (en) | Light detection and ranging (lidar) imaging systems and methods | |
EP1408702A2 (en) | Three-dimensional photographing apparatus, three-dimensional photographing method and stereo adapter | |
JP2002369049A (en) | Image detector and aperture device | |
US10852436B2 (en) | Imaging system and method for monitoring a field of view | |
US20240353265A1 (en) | Systems and Methods for Infrared Sensing | |
JP3986748B2 (en) | 3D image detection device | |
JP2010152026A (en) | Distance measuring device and object moving speed measuring device | |
JP2011082855A (en) | Imaging apparatus | |
CN110392818B (en) | Distance measuring device, head-mounted display device, portable information terminal, image display device, and periphery monitoring system | |
US20200128188A1 (en) | Image pickup device and image pickup system | |
JP2011082856A (en) | Imaging apparatus | |
JP4085720B2 (en) | Digital camera | |
JP2011081657A (en) | Imaging system | |
JP3620710B2 (en) | Imaging device | |
KR100403762B1 (en) | apparatus for photograping and measuring speed | |
JP2001119723A (en) | Still object image detector and still object image detection method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 732E | Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977) | Free format text: REGISTERED BETWEEN 20120419 AND 20120425 |
| WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) | |