WO2006013765A1 - Person determination device and person search/tracking device - Google Patents
Person determination device and person search/tracking device
- Publication number
- WO2006013765A1 (PCT/JP2005/013769)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- walking
- information
- person
- image
- sequence
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
Definitions
- the present invention relates to a person determination apparatus that determines whether or not persons included in different image sequences are the same person, and an apparatus for searching and tracking a person using the person determination apparatus.
- (See, for example, Patent Document 1.)
- FIG. 1A to FIG. 1C are diagrams for explaining a person search / tracking method described in Patent Document 1.
- FIG. 1A and FIG. 1B show temporally continuous frame images taken of the same person.
- A frame image A10 shown in FIG. 1A shows an image of a person A1 moving in the right direction.
- the rectangle A12 is a circumscribed rectangle including the head and torso parts detected as a region where the motion is small (person stable region) using the motion vector.
- the circumscribed rectangle A22 of the human stable area detected from the person A21 is indicated by a broken line.
- Patent Document 1: Japanese Patent Application Laid-Open No. 2003-346159 (pages 3 and 6, FIGS. 2 and 9)
- Disclosure of Invention
- An object of the present invention is to solve the above-described conventional problem and to provide a person determination device that can determine the same person even between frames that are separated in time or between frames shot by different cameras.
- Another object of the present invention is to provide a person search / tracking apparatus that searches for and tracks a person using the person determination apparatus.
- In order to achieve the above objects, the person determination device according to the present invention is a person determination device that determines whether or not persons included in different image sequences are the same, and includes: image sequence input means for receiving input of a first image sequence and a second image sequence acquired at a different time or by a different image sensor from the first image sequence; walking sequence extraction means for extracting, from the input first and second image sequences, first and second walking sequences, which are image sequences indicating the walking state of a person; walking information extraction means for extracting, from the extracted first and second walking sequences, first and second walking information, which is information identifying the periodic movement of walking; walking information collation means for collating the extracted first and second walking information; and determination means for determining, based on the collation result by the walking information collation means, whether or not the persons included in the first and second image sequences are the same.
- The walking information may be, for example, information indicating a temporal or spatial walking cycle of a person, temporal or spatial phase information in the periodic walking motion of a person, or temporal or spatial position information of the periodic walking motion of a person.
- As a result, human image sequences obtained from different frames or from different sensors can be associated with each other.
- Thus, a person determination device that can determine the same person, and a person search / tracking device that searches for and tracks a person using the person determination device, are realized.
- FIG. 1A is a diagram showing an example of a detection rectangle in a conventional person search / tracking apparatus.
- FIG. 1B is a diagram showing an example of a detection rectangle in a conventional person search / tracking apparatus.
- FIG. 1C is a diagram showing movement of a detection rectangle in a conventional person search / tracking apparatus.
- FIG. 2 is a functional block diagram showing a configuration of a person determination device according to Embodiment 1 of the present invention.
- FIG. 3 is a diagram showing an example of an image sequence in the first embodiment of the present invention.
- FIG. 4A is a diagram showing an example of a sequence of human lower body images.
- FIG. 4B is a diagram showing an example of a walking sequence in the first embodiment of the present invention.
- FIG. 4C is a diagram showing an example of the shape of the minimum pattern of the walking sequence.
- FIG. 5 is a diagram showing spatiotemporal phase information and spatiotemporal position information in Embodiment 1 of the present invention.
- FIG. 6A is a diagram showing an example of a walking locus in the first embodiment of the present invention.
- FIG. 6B is a diagram showing an example of the estimated walking state in the first embodiment of the present invention.
- FIG. 6C is a diagram showing an example of walking states with different phase states in Embodiment 1 of the present invention.
- FIG. 7A is a diagram showing an example of the relationship between the walking locus and the change between the legs in the first embodiment of the present invention.
- FIG. 7B is a diagram showing an example of a change between legs in the first embodiment of the present invention.
- FIG. 7C is a diagram showing an example of a change between legs in the first embodiment of the present invention.
- FIG. 8A is a diagram showing an example of a display of a walking sequence in the first embodiment of the present invention.
- FIG. 8B is a diagram showing an example of a walking sequence display in the first embodiment of the present invention.
- FIG. 8C is a diagram showing an example of a walking sequence display in the first embodiment of the present invention.
- FIG. 9 is a diagram showing an example of a storage format of a result of searching and tracking a walking sequence according to Embodiment 1 of the present invention.
- FIG. 10 is a block diagram showing a configuration of a person search / tracking apparatus according to Embodiment 1 of the present invention.
- FIG. 11 is a diagram showing an example of a screen for performing a search and tracking instruction in the first embodiment of the present invention.
- FIG. 12 is a flowchart showing an example of a verification procedure in the first embodiment of the present invention.
- FIG. 13A is a diagram showing an example of an image sequence 1 in the first embodiment of the present invention.
- FIG. 13B is a diagram showing an example of an image sequence 2 in the first embodiment of the present invention.
- FIG. 14 is a flowchart showing an example of a walking sequence extraction procedure according to Embodiment 1 of the present invention.
- FIG. 15 is a flowchart showing an example of a procedure for extracting spatio-temporal period information, spatio-temporal phase information, and spatio-temporal position information in Embodiment 1 of the present invention.
- FIG. 16A is a diagram showing an example of detection of a specific walking state in the first embodiment of the present invention.
- FIG. 16B is a diagram showing an example of a detection template for a specific walking state in the first embodiment of the present invention.
- FIG. 16C is a diagram showing an example of a specific walking state detection process in the first embodiment of the present invention.
- FIG. 17 is a diagram showing an example of display by the control unit in Embodiment 1 of the present invention.
- FIG. 18A is a diagram showing an example of an image sequence 1 in the second embodiment of the present invention.
- FIG. 18B is a diagram showing an example of an image sequence 2 in the second embodiment of the present invention.
- FIG. 19 is a functional block diagram showing a configuration of a person determination device according to Embodiment 2 of the present invention.
- The person determination device according to the present invention is a person determination device that determines whether or not persons included in different image sequences are the same, and includes: image sequence input means for accepting input of a first image sequence and a second image sequence acquired at a different time or by a different image sensor from the first image sequence; walking sequence extraction means for extracting, from the input first and second image sequences, first and second walking sequences, which are image sequences indicating the walking state of a person; walking information extraction means for extracting, from the extracted first and second walking sequences, first and second walking information, which is information identifying the periodic movement of the person's walking; walking information collation means for collating the extracted first and second walking information; and determination means for determining whether or not the persons included in the first and second image sequences are the same based on the collation result by the walking information collation means.
- Here, the walking information may be, for example, information indicating a temporal or spatial walking cycle of a person, temporal or spatial phase information in the periodic walking motion of a person, or temporal or spatial position information of the periodic walking motion of a person.
- The present invention makes use of the fact that walking characteristics such as the walking cycle and step length differ between different people, and that the same person walks with constant walking characteristics.
- Therefore, the same person can be determined without depending on temporal or spatial position, and the same person can be determined even between frames that are separated in time or between frames shot by different cameras.
- Here, the walking information collation means may collate the walking information by comparing, based on first and second spatiotemporal phase information included in the first and second walking information respectively, the time or position at which the persons included in the first and second image sequences assume a predetermined walking posture.
- More specifically, the walking information collation means may include phase information estimation means for estimating, based on the first spatiotemporal phase information included in the first walking information, spatiotemporal phase information of the person included in the first image sequence at a time or position different from the first image sequence, and may collate the walking information by comparing, based on the spatiotemporal phase information estimated by the phase information estimation means and the second spatiotemporal phase information, the time or position at which the persons included in the first and second image sequences assume the predetermined walking posture.
- Alternatively, the walking information collation means may collate the walking information by comparing, based on the first and second spatiotemporal phase information included in the first and second walking information respectively, the walking postures of the persons included in the first and second image sequences.
- More specifically, the walking information collation means may include phase information estimation means for estimating, based on the phase information included in the first spatiotemporal phase information, the walking posture of the person included in the first image sequence at a time or position different from the first image sequence, and may collate the walking information by comparing the walking postures of the persons included in the first and second image sequences at the same time or position.
- Here, the walking sequence is, for example, an image on a cut surface obtained when each image sequence is cut along the time axis; specifically, it is an image obtained by arranging, along the time axis, sections that cut across both legs of the person included in each image sequence. This makes it possible to extract the temporal and spatial walking characteristics of a person.
- the image sequence input means may accept input of first and second image sequences acquired by different image sensors that photograph the same place. This makes it possible to identify a person in different image sequences obtained by photographing the same place where the blind spot exists from different angles.
- The person determination device may further include correction information storage means for storing in advance correction information indicating the correspondence between positions on the image and positions at the shooting location for each of the first and second image sequences, and correction means for performing spatiotemporal correction processing in the extraction of the first and second walking information by the walking information extraction means based on the correction information stored in the correction information storage means. As a result, even when the images are acquired by different image sensors, inconsistencies between the images caused by differences in the placement position, shooting direction, and the like of the image sensors can be corrected.
- Here, the correction information may be, for example, information that identifies grid lines that two-dimensionally divide the walking surface at the shooting location at fixed intervals.
- The person search / tracking apparatus according to the present invention is a person search / tracking apparatus that searches for or tracks a specific person in image sequences in which persons are captured, and includes the person determination device and walking sequence storage means for storing, in association with each other, the first and second walking sequences corresponding to the first and second walking information when the first and second walking information are determined to match by the walking information collation means provided in the person determination device.
- Another person search / tracking apparatus according to the present invention is a person search / tracking apparatus that searches for or tracks a specific person in image sequences in which persons are captured, and includes the person determination device and display means for displaying the first and second image sequences received by the image sequence input means provided in the person determination device, wherein the display means performs highlighting in the first and second image sequences so as to distinguish a person determined to be the same person by the determination means provided in the person determination device from other persons. As a result, even when different images are displayed at the same time, the same person can be recognized immediately from the highlighting, which facilitates searching for and tracking the person.
- A person determination device according to another aspect of the present invention is a person determination device that determines whether or not persons included in different image sequences are the same, and includes: walking sequence detection means for detecting first and second walking sequences, which are image sequences indicating the walking states of a first person and a second person included in the image sequences; walking posture transition estimation means for estimating, from the walking sequence of the first person, information indicating the transition of the walking posture in the periodic walking motion of the first person at a time or position different from that walking sequence; and determination means for determining the consistency between the estimated information indicating the transition of the walking posture of the first person and information indicating the transition of the walking posture of the second person, and determining that the first person and the second person are the same when they are consistent.
- According to this configuration, the same person is determined by paying attention to the person's walking, rather than by assuming that persons located close to each other between frames are the same or by matching colors and image patterns.
- The present invention makes use of the fact that walking characteristics such as the walking cycle and step length differ between different people, and that the same person walks with constant walking characteristics.
- Therefore, the same person can be determined without depending on temporal or spatial position, and the same person can be determined even between frames that are separated in time or between frames shot by different cameras.
- Note that the present invention can be realized not only as the person determination device and person search / tracking device described above, but also as a person determination method and a person search / tracking method, as a program that causes a computer to execute the methods, or as a computer-readable recording medium on which the program is recorded.
- FIG. 2 is a functional block diagram showing a configuration of person determination device 10 in the present embodiment.
- This person determination device 10 is a device that determines whether or not persons included in different image sequences are the same person by paying attention to the continuity of the persons' walking sequences, and includes a walking posture detection unit 200, a walking state estimation unit 180, and a determination unit 190.
- the walking posture detection unit 200 is a processing unit that detects a walking sequence including a predetermined walking posture of the first person from a moving image.
- The walking state estimation unit 180 is a processing unit that estimates, from the walking sequence of the first person, the walking state of the first person (the transition state of the posture in the periodic walking motion) at a different time or position.
- The determination unit 190 is a processing unit that determines the consistency between the walking state of the first person and the walking state of the second person and, if they are consistent, determines that the first person and the second person are the same.
- The walking posture detection unit 200 includes an image sequence input unit 100 and a walking sequence extraction unit 110.
- the walking state estimation unit 180 includes a spatiotemporal period information extraction unit 120, a spatiotemporal phase information extraction unit 121, and a spatiotemporal position information extraction unit 122.
- The determination unit 190 includes a spatio-temporal period information storage unit 130, a spatio-temporal phase information storage unit 131, a spatio-temporal position information storage unit 132, a spatio-temporal period collation unit 140, a spatio-temporal phase collation unit 141, a spatio-temporal difference extraction unit 142, a coincidence determination unit 150, and a control unit 160.
- the walking posture detection unit 200 is an example of a walking sequence detection unit that detects a walking sequence that is an image sequence indicating a walking state of a person included in the image sequence.
- the walking state estimation unit 180 is an example of a walking posture transition estimation unit that estimates information indicating a transition of a walking posture in a periodic walking motion of a person at a time or position different from the walking sequence from the walking sequence.
- The determination unit 190 is an example of determination means that determines the consistency of information indicating the transition of the walking posture of two persons captured at different times or by different image sensors, and determines whether or not the two persons are the same person.
- the “information indicating the transition of the walking posture” is information including period information and phase information described later.
- the image sequence input unit 100 is an example of an image sequence input unit that accepts input of first and second image sequences acquired at different times or different image sensors.
- The walking sequence extraction unit 110 is an example of walking sequence extraction means that extracts first and second walking sequences, which are image sequences indicating the walking state of a person, from the first and second image sequences, respectively.
- the spatio-temporal phase information extraction unit 121, the spatio-temporal position information extraction unit 122, and the spatio-temporal period information extraction unit 120 respectively perform periodic movements regarding the walking of the person from the first and second walking sequences. It is an example of walking information extracting means for extracting first and second walking information that is information to be specified.
- the spatiotemporal phase matching unit 141, the spatiotemporal difference extracting unit 142, and the spatiotemporal period matching unit 140 are an example of walking information matching means for matching the extracted first and second walking information.
- The coincidence determination unit 150 is an example of determination means that determines, based on the collation result, whether or not the persons included in the first and second image sequences are the same person.
- The image sequence input unit 100 is a signal interface or the like that acquires an image sequence from a camera or an image recording device.
- An “image sequence” is an array in which frame images taken as shown in FIG. 3 are arranged in time order.
- The walking sequence extraction unit 110 is a processing unit that extracts a walking sequence from the image sequence acquired by the image sequence input unit 100.
- A "walking sequence" is a sequence of walking states obtained from the region of walking motion in each frame image.
- Figure 4B shows an example of a walking sequence.
- Figure 4A shows an array of human lower body regions extracted from each frame and arranged in chronological order.
- Fig. 4B shows the portions of each frame image at the position of the broken line B10 in Fig. 4A, arranged in chronological order.
- The black bands in Fig. 4B show the movement trajectories of the toes (an image obtained by arranging, along the time axis, sections that cut across both legs of the person). A specific method for calculating the walking sequence will be described later.
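- The cut-surface image of Fig. 4B can be computed directly from a frame sequence. The following is a minimal sketch (not from the patent; `frames` and the ankle-height cut `row` are assumed inputs) that stacks one scan line per frame so that the trajectories of both legs appear as two crossing bands:

```python
import numpy as np

def extract_gait_slice(frames, row):
    """Stack the scan line at `row` (e.g. the broken line B10) from each
    frame along the time axis, yielding a T x W image in which the leg
    trajectories appear as crossing bands, as in FIG. 4B."""
    return np.stack([frame[row, :] for frame in frames], axis=0)
```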
- The spatiotemporal period information extraction unit 120 is a processing unit that extracts period information of the spatiotemporal changes of walking from the walking sequence extracted by the walking sequence extraction unit 110.
- "Period information" means the number of steps per fixed time, the result of frequency analysis applied to the spatiotemporal position change of a specific part such as a foot or hand, or the shape of the minimum pattern that is repeated spatiotemporally.
- An example of spatiotemporal period information is illustrated in Fig. 4B, where black triangles and white triangles are displayed at the spatiotemporal points where the toes cross, on the time axis and on the horizontal axis of the image (corresponding to the spatial axis).
- The spatial interval between adjacent black triangles and the time interval between adjacent white triangles are examples of spatio-temporal period information.
- Alternatively, the shape of the walking pattern enclosed by the broken lines at adjacent black triangle positions and adjacent white triangle positions may be used, or frequency characteristics (such as the spectral intensity in a specific frequency band) obtained by frequency analysis of temporal stride changes such as curve B11 or spatial stride changes such as curve B12 in Fig. 4B may be used.
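- As one hypothetical illustration of such frequency analysis (assuming a 1-D signal of the inter-leg width per frame, in the spirit of curve B11), the dominant temporal period could be estimated as follows:

```python
import numpy as np

def temporal_step_period(leg_width, fps=30.0):
    """Estimate the dominant temporal walking period (seconds) from the
    per-frame inter-leg width signal via its amplitude spectrum."""
    x = np.asarray(leg_width, dtype=float)
    x = x - x.mean()                       # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    k = spectrum[1:].argmax() + 1          # strongest non-zero frequency bin
    return 1.0 / freqs[k]
```

- The spectral intensity in a specific band, mentioned above, corresponds to the values of `spectrum` around the dominant bin.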
- the spatiotemporal cycle information storage unit 130 is a memory or the like that stores the spatiotemporal cycle information extracted by the spatiotemporal cycle information extraction unit 120 together with the detected time, the position in the image, and the like.
- The spatio-temporal period collation unit 140 is a processing unit that collates the spatio-temporal period information extracted by the spatio-temporal period information extraction unit 120 with the spatio-temporal period information stored in the spatio-temporal period information storage unit 130.
- The spatiotemporal phase information extraction unit 121 is a processing unit that extracts phase information of the spatiotemporal changes of walking from the walking sequence extracted by the walking sequence extraction unit 110.
- Phase information means a transition state during a walking motion that is a periodic motion (a position and time at which a predetermined walking posture is taken, or a walking posture at a specific position and time). For example, even in a walking sequence having the same spatiotemporal period, information on which spatiotemporal position the foot is on the ground (predetermined walking posture) is spatiotemporal phase information. The difference in walking posture in two walking sequences compared at the same time or at the same position is also spatiotemporal phase information. Examples of spatiotemporal phase information will be described with reference to FIGS.
- FIG. 5 shows the walking trajectory of the foot position as in FIG. 4B, but shows two walking trajectories A010 and A011 having the same spatiotemporal period and different spatiotemporal phases.
- the walking trajectories A010 and A011 have the same step length and walking cycle, but the position and time at which the foot touches the ground, or the position and time at which both feet cross each other are different.
- Fig. 6A shows two walking trajectories, the first person's walking trajectory 1802 (broken line) and the second person's walking trajectory 1801 (solid line), which are detected discontinuously due to the presence of the obstacle 1800 (shaded area).
- Since phase information is the position and time at which a predetermined walking posture is taken, or the walking posture at a predetermined position and time, it can be used to determine whether the second person's walking trajectory 1801 and the first person's walking trajectory 1802 belong to the same person.
- For this purpose, the spatio-temporal phase information extraction unit 121 estimates, from the position and time of the predetermined posture (e.g., the crossing of the legs) of the first person's walking trajectory 1802 and its periodicity, the posture at other times and positions (broken line 1803 in Fig. 6B).
- the spatiotemporal phase information extraction unit 121 estimates the walking posture (dashed line 1805) at another time position from the walking trajectory 1804 (dashed line) on the right side of the obstacle 1800.
- the spatiotemporal phase information extraction unit 121 obtains the time and position at which the predetermined walking posture is reached.
- Here, the state where the legs cross each other (the state where the inter-leg distance is minimized) is used as the predetermined posture.
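- A minimal sketch of this estimation (illustrative only; a constant walking period and one observed crossing time are assumed) extrapolates the leg-crossing times into the unobserved interval and checks phase consistency:

```python
import numpy as np

def predict_crossings(t0, period, t_start, t_end):
    """Extrapolate leg-crossing times from one observed crossing t0 and
    the walking period into the interval [t_start, t_end] (the broken
    line 1803 in FIG. 6B is the spatial analogue of this idea)."""
    n_first = int(np.ceil((t_start - t0) / period))
    n_last = int(np.floor((t_end - t0) / period))
    return t0 + period * np.arange(n_first, n_last + 1)

def phase_consistent(t_cross_a, t_cross_b, period, tol):
    """Two trajectories agree in phase when their crossing times differ
    by an integer number of periods, within tolerance tol."""
    d = (t_cross_b - t_cross_a) % period
    return min(d, period - d) <= tol
```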
- Fig. 7A shows the temporal or positional change of the inter-leg distance. The inter-leg distance can be obtained from the image as the width between the trajectories of the two legs in the walking trajectory.
- The spatio-temporal phase information extraction unit 121 estimates the inter-leg state (posture state) 1903a (broken line) at times and positions where the first person's inter-leg information 1902a (broken line) is not captured due to the presence of the obstacle 1900a (hatched area), as shown in Fig. 7B.
- The spatio-temporal phase information extraction unit 121 obtains, as the phase information, the time or position at which the predetermined posture first occurs, taking as the reference the temporally first position of the captured region or the left edge of the image.
- In Fig. 7B, the time or position 1905 is obtained as phase information for the estimated walking state 1903a of the first person, and the time or position 1906 is obtained for the walking state 1901a of the second person.
- Likewise, in Fig. 7C, phase information 1910 is obtained for the walking state 1909a (broken line) estimated from the walking state 1908a (broken line), and phase information 1911 is obtained for the walking state 1907a (solid line).
- the spatiotemporal phase information extraction unit 121 similarly obtains the walking state or the estimated walking state at the predetermined time or position.
- the spatio-temporal phase information extraction unit 121 obtains a leg interval (walking posture) at a predetermined time or position 1904.
- In Fig. 7B, the phase information for the first person is the value of the estimated walking state 1903a (broken line) at the predetermined time or position 1904, and the phase information for the second person is the value of the walking state 1901a (solid line) at 1904.
- Similarly, in Fig. 7C, the value at the predetermined time or position 1904 of the estimated walking state 1909a, estimated from the walking state 1908a, and the value of the walking state 1907a at 1904 become the phase information of the first person and the second person, respectively.
- In the above description, the estimated posture is used only for the first person; however, the estimated posture may also be obtained for the second person, and the phase information may be obtained from the estimated postures for both the first person and the second person.
- Alternatively, the position of the obstacle in FIGS. 6A to 6C and FIGS. 7A to 7C, or the out-of-view range shown in the figures, may be regarded as a common position, and the phase information may be obtained after estimating the walking state at that common position.
- As the predetermined posture, a state where the inter-leg distance is maximized, or a position where the change of the inter-leg distance is maximized, may also be used.
- The spatio-temporal phase information storage unit 131 is a memory or the like that stores the spatio-temporal phase information extracted by the spatio-temporal phase information extraction unit 121.
- The spatiotemporal phase collation unit 141 is a processing unit that collates the spatiotemporal phase information extracted by the spatiotemporal phase information extraction unit 121 with the spatiotemporal phase information stored in the spatiotemporal phase information storage unit 131.
- The spatiotemporal position information extraction unit 122 is a processing unit that extracts, from the walking sequence extracted by the walking sequence extraction unit 110, the spatiotemporal position at which the walking sequence was extracted, and generates spatiotemporal position information. "Spatio-temporal position information" means the time and place at which the walking sequence was detected. An example of spatio-temporal position information will be described with reference to FIG. 5. In Fig. 5, the spatiotemporal position at which the feet first cross in time is indicated for each of the two walking trajectories by the crossing broken lines. Information indicating the absolute temporal and spatial position of walking in this way is spatio-temporal position information.
- The spatio-temporal position information storage unit 132 is a memory or the like that stores the spatio-temporal position information generated by the spatio-temporal position information extraction unit 122.
- The spatio-temporal difference extraction unit 142 is a processing unit that obtains the difference between the spatio-temporal position information generated by the spatio-temporal position information extraction unit 122 and the spatio-temporal position information stored in the spatio-temporal position information storage unit 132.
- The coincidence determination unit 150 is a processing unit that determines coincidence or non-coincidence between different walking sequences based on the results of the spatiotemporal period collation unit 140, the spatiotemporal phase collation unit 141, and the spatiotemporal difference extraction unit 142. That is, the coincidence determination unit 150 determines whether or not the walking sequences belong to the same person.
- An example of the method for determining coincidence or non-coincidence is as follows. For the spatio-temporal period information, when the difference between the two pieces of spatio-temporal period information is equal to or less than a predetermined threshold, the coincidence determination unit 150 determines that the two walking sequences coincide, that is, that the walking sequences belong to the same person.
- The coincidence determination unit 150 handles the spatio-temporal phase information and the spatio-temporal position information in the same manner as the spatio-temporal period information; if all items match, or if a predetermined number of items match, it can determine that the two walking sequences match.
- the determination method is not limited to the above, and a method generally used for pattern recognition or the like can be applied.
- For example, the temporal phase information tp1 and the spatial phase information (positional phase information) sp1 obtained from the walking sequence of the first person in the moving image are compared with the temporal phase information tp2 and the spatial phase information (positional phase information) sp2 obtained from the walking sequence of the second person at a time or position different from that at which tp1 and sp1 were obtained.
- When |tp1 − tp2| ≤ θt (θt is a predetermined threshold value) and |sp1 − sp2| ≤ θs (θs is a predetermined threshold value), the coincidence determination unit 150 determines that the two persons are the same.
- For example, the coincidence determination unit 150 determines that the walking state 1801 and the walking state 1803 have matching spatiotemporal phase information, and that the first person and the second person are the same.
- the coincidence determination unit 150 determines that the walking state 1806 and the walking state 1805 have different spatiotemporal phase information, and the first person and the second person are different.
- the coincidence determination unit 150 determines that the walking state 1901a and the walking state 1903a in FIG. 7B have the same spatio-temporal phase information, and the first person and the second person are the same.
- the coincidence determination unit 150 determines that the walking state 1907a and the walking state 1909a in FIG. 7C have different spatiotemporal phase information and the first person and the second person are different.
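- The threshold test above reduces to a few comparisons; a minimal sketch (parameter names are illustrative, not from the patent):

```python
def same_person(tp1, sp1, tp2, sp2, theta_t, theta_s):
    """Coincidence test of unit 150 on phase information:
    |tp1 - tp2| <= theta_t and |sp1 - sp2| <= theta_s."""
    return abs(tp1 - tp2) <= theta_t and abs(sp1 - sp2) <= theta_s
```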
- The control unit 160 is a processing unit that performs control, such as displaying the image sequences used for collation, based on the result of the coincidence determination by the coincidence determination unit 150.
- Figures 8A to 8C show display examples.
- FIG. 8A shows an example in which two compared image sequences are displayed with the person's area enlarged to the left and right, and the shooting time and location of each image sequence are displayed together.
- FIG. 8B shows an example in which the same result as in FIG. 8A is displayed together with the movement locus of the person (arrow in the figure).
- In FIGS. 8A and 8B, the shooting time and location are displayed as text together with the image sequences; however, for the location, a map may be displayed with the shooting position and movement trajectory superimposed on it.
- In addition to the display, the control performed by the control unit 160 may include storing, in an external storage device (walking sequence storage device), connection information between walking sequences indicating that different walking sequences match. By storing in this way which walking sequences match and which do not, the information can be used when searching for and tracking walking images of a person.
- Fig. 9 shows an example of the storage format of the result of searching and tracking the walking sequence by the above procedure.
- Figure 9 shows information for three walking sequences, including five items of information for one walking sequence.
- the five items are the sequence number, spatiotemporal period information, spatiotemporal phase information, spatiotemporal position information, and coincidence sequence number.
- the sequence number is an ID number assigned to each walking sequence with different shooting times and shooting cameras.
- The spatiotemporal period information expresses the number of steps x within a fixed time and the number of steps y within a fixed distance as (x, y).
- The spatio-temporal phase information indicates the amount of movement from a reference time or spatial position to the time or spatial position at which the legs first cross, expressed as a ratio with one step cycle taken as 1.0. For example, in the case of sequence number 1, the legs cross at a time 0.5 steps after the reference time and at a distance 0.1 steps from the reference position.
- the spatio-temporal position information indicates the time and place where the walking sequence is first detected.
- the location shows the pixel coordinate values in the vertical and horizontal directions on the image.
- The matching sequence number indicates the sequence numbers of the walking sequences determined by search and tracking to match the walking sequence in question. If there is no matching sequence, 0 may be set; if there are multiple matching sequences, multiple numbers may be written together. By storing the result of a search and tracking once performed, the match determination can be omitted by referring to this information when the same search and tracking is performed again.
- Each sequence number may also be associated with a reference to the captured images (file name, storage address, etc.).
- The spatial location in the spatio-temporal position information may be expressed in a dedicated coordinate system as shown in Fig. 9, or in a general-purpose coordinate system such as latitude and longitude.
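- One possible in-memory representation of a Fig. 9 record (field names and the numeric period values are illustrative, not from the patent) is:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class WalkingSequenceRecord:
    sequence_no: int
    period: Tuple[float, float]      # (steps x per fixed time, steps y per fixed distance)
    phase: Tuple[float, float]       # offset to first leg crossing, in step cycles (1.0 = one cycle)
    position: Tuple[str, int, int]   # (detection time, image x, image y in pixels)
    matching: List[int] = field(default_factory=list)  # empty list plays the role of "0": no match

# Sequence number 1 from the text: legs cross 0.5 steps after the reference
# time and 0.1 steps from the reference position (period values are placeholders).
record1 = WalkingSequenceRecord(1, (2.0, 1.4), (0.5, 0.1), ("10:00:00", 120, 40), [3])
```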
- FIG. 10 is a block diagram showing a configuration of a person search / tracking apparatus 20 to which the person determination apparatus 10 is applied.
- This person search / tracking apparatus 20 is a specific example of a system or apparatus that realizes the method of searching for and tracking a person in image sequences in the present embodiment, and includes cameras 1010 and 1020, a clock 1030, a storage device 1040, a processing device 1050, a display device 1060, an input unit 1070, and a pointing device 1080.
- the cameras 1010 and 1020 are an example of the image sequence input unit 100 and take an image including the person 1000.
- the clock 1030 is a timer for obtaining the shooting time.
- the storage device 1040 is a hard disk or the like that stores images taken by the cameras 1010 and 1020, a shooting time obtained from the clock 1030, and a person search / tracking result.
- The processing device 1050 is a device that performs the processing for searching for and tracking a person in the image sequences obtained from the cameras 1010 and 1020 or the storage device 1040, and corresponds to the person determination device 10 shown in FIG. 2.
- The display device 1060 is a display that shows the processing results of the processing device 1050.
- The input unit 1070 is a keyboard used for search and tracking instructions, and the pointing device 1080 is a mouse or the like used for search and tracking instructions.
- Figure 11 shows an example of pointing.
- As shown in FIG. 11, the person 1091 to be searched for or tracked is designated on the image using the pointer 1090, and a walking sequence matching that of the person 1091 is searched for and tracked.
- Each component is connected through a communication path.
- the communication path may include a dedicated line or a public line, which may be wired or wireless.
- Examples of image sequence 1 and image sequence 2 are shown in FIGS. 13A and 13B, respectively.
- FIG. 13A shows an image sequence 1 and shows a sequence in which a person 502 is walking in the right direction on the left side of the obstacle 501.
- FIG. 13B shows image sequence 2, an image sequence taken with the same camera at the same place 10 seconds after image sequence 1.
- Image sequence 2 includes the obstacle 501 and two persons walking in the right direction, a person 503 and a person 504.
- the image sequence input unit 100 inputs the image sequence 1 (step S401).
- Next, the walking sequence extraction unit 110 extracts a walking sequence (walking sequence 1) from image sequence 1 (step S402).
- Here, the case of using a sequence of lower-body regions as shown in Fig. 4A as the walking sequence will be described.
- The walking sequence extraction unit 110 extracts a walking sequence from the image sequence as follows.
- the walking sequence extraction unit 110 reads one frame image from the image sequence (step S601). Frame images to be read are processed in chronological order from unread frame images.
- the walking sequence extraction unit 110 detects a human region from the read frame image (step S602).
- As a person region detection method, an inter-frame difference method generally used for detecting moving objects, or a background difference method in which a background image without a person is prepared in advance and the difference from the background image is calculated, can be used.
- The person region can then be detected by extracting a region where the difference is large.
- the walking sequence extraction unit 110 extracts information representing the walking state from the image of the human region (step S603).
- The information representing the walking state is information representing the temporal transition of the walking state, such as the toe trajectory information in FIG. 4B.
- step S602 and step S603 may be performed simultaneously as a series of processing, or the processing result of step S602 may be used as it is as the processing result of step S603. Further, the processing may be performed so that the output in step S603 can be obtained directly without explicitly detecting the entire human region as in step S602. For example, in the case of the above-mentioned lower body image, the walking state information may be directly acquired using a template matching method using the lower body image as a template.
- The walking sequence extraction unit 110 then determines whether or not the currently read frame image is the last frame (step S604). If it is the last frame, the walking sequence extraction process ends; if unread frame images remain, the process returns to step S601.
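- A compact sketch of this loop (steps S601 to S604), assuming greyscale numpy frames and a pre-recorded background image for the background difference method (the ankle-height fraction is an assumed parameter):

```python
import numpy as np

def extract_walking_sequence(frames, background, thresh=30, ankle_frac=0.9):
    """S601: read frames in time order; S602: person region by background
    difference; S603: keep the person mask on an ankle-height scan line as
    the walking-state information; S604: stop after the last frame."""
    rows = []
    for frame in frames:
        diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
        person_mask = diff > thresh
        rows.append(person_mask[int(ankle_frac * frame.shape[0]), :])
    return np.stack(rows, axis=0)   # T x W walking sequence (FIG. 4B style)
```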
- Next, the spatiotemporal period information extraction unit 120, the spatiotemporal phase information extraction unit 121, and the spatiotemporal position information extraction unit 122 extract spatiotemporal period information, spatiotemporal phase information, and spatiotemporal position information, respectively, from walking sequence 1 (step S403).
- the spatiotemporal period information extraction unit 120 and the like detect the position of the specific walking state from the information of the walking sequence (step S701).
- the specific walking state will be explained using FIGS. 16A to 16C.
- Fig. 16A shows the result of detecting the specific walking state position for the walking sequence of Fig. 4B.
- the two black corrugated bands that intersect each other in Fig. 16A represent the temporal trajectory of the toes.
- the horizontal axis represents the horizontal position of the image
- the vertical axis represents time.
- the position where two legs cross each other, that is, the position where both feet intersect is defined as the specific walking state position.
- The positions of the intersections can be detected by preparing an intersection-shaped pattern as shown in FIG. 16B and performing correlation calculation by template matching.
- Figure 16C shows an example of the detection process.
- The degree of shape coincidence is calculated while shifting the position of the detection template 801 with respect to the walking trajectory 800. If the degree of coincidence is equal to or greater than a predetermined value, it is determined that the specific walking state occurs at that position. In this way, the intersection positions indicated by the broken lines in FIG. 16A are obtained.
- the specific walking state is not limited to when the legs cross, but may be the state where the legs are most widened.
- the state where the legs are most widened corresponds to the position of the widest interval between the intersecting bands on the walking locus in FIG. 16A (the dashed line in the figure).
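- A brute-force version of this template matching (an illustrative sketch: normalised correlation over a T x W gait slice, without the non-maximum suppression a practical detector would add):

```python
import numpy as np

def detect_specific_states(gait_slice, template, min_score=0.7):
    """Slide a crossing-shaped template (FIG. 16B) over the gait slice and
    return the centres whose normalised correlation reaches min_score."""
    gs = np.asarray(gait_slice, dtype=float)
    th, tw = template.shape
    t_norm = (template - template.mean()) / (template.std() + 1e-9)
    hits = []
    for t in range(gs.shape[0] - th + 1):
        for x in range(gs.shape[1] - tw + 1):
            patch = gs[t:t + th, x:x + tw]
            p_norm = (patch - patch.mean()) / (patch.std() + 1e-9)
            score = float((t_norm * p_norm).mean())
            if score >= min_score:
                hits.append((t + th // 2, x + tw // 2, score))
    return hits
```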
- Step S701 is performed until all the specific walking states are detected, and then the process proceeds to step S703 (step S702).
- the spatiotemporal period information extraction unit 120 generates spatiotemporal period information by calculating the interval between the detected specific walking state positions (step S703).
- Note that the period information may instead be calculated using Fourier analysis, wavelet analysis, or an autocorrelation method, without obtaining the positions of the specific walking state.
- In this case, the Fourier transform, wavelet transform, or autocorrelation method may be applied to the temporal change in the positions of the corrugated bands in FIG. 16A or to the spatiotemporal change in the width between the two bands.
- Alternatively, the shape of the minimum unit of the periodic walking trajectory in FIG. 16A may be used as the pattern. In this case, the pattern is the single-wave shape of the two bands within the range enclosed by the vertical and horizontal broken lines in FIG. 16A.
- the spatiotemporal phase information extraction unit 121 calculates spatiotemporal phase information (step S704).
- Spatio-temporal phase information represents the amount of temporal and spatial movement from a reference time and spatial position to the time and spatial position at which a specific walking state appears.
- When the time or position at which the predetermined posture (the minimum inter-leg distance) is taken is used, in the example of Fig. 7B, the value 1905 becomes the temporal phase information or spatial phase information for the walking 1902a, and the value 1906 becomes the temporal phase information or spatial phase information for the walking 1901a.
- When the walking posture at a predetermined time or position is used, the value (inter-leg distance) of the estimated walking state 1903a at 1904 becomes the temporal phase information or spatial phase information for the walking 1902a, and the value of the walking state 1901a at 1904 becomes that for the walking 1901a.
- In the example of Fig. 5, the time from the upper end (the reference time) to the position at which the feet first cross is the temporal phase information.
- The reference for the spatial position is the vertical line A013 in the walking trajectory diagram, and the distance between the vertical line A013 and the crossing position closest to it on its right side is the spatial phase information.
- The method of expressing the amount of movement is not limited to the above; it may be expressed as a relative quantity based on the time and movement amount of a one-step sequence.
- As described above, spatiotemporal phase information means phase information about where in the image the specific walking state appears; even walking sequences with the same spatiotemporal period have different values when the spatiotemporal timing at which a foot touches the ground or the feet cross differs. In Fig. 5, the spatio-temporal period of walking is the same in the walking sequences A010 and A011, but the spatiotemporal phase information defined above has different values.
- the spatiotemporal position information extraction unit 122 calculates spatiotemporal position information (step S705).
- Here, the spatio-temporal position information is the time at which the specific walking state is first detected and its position on the image.
- The spatio-temporal position information is information indicating the absolute spatiotemporal position coordinates of the detected walking; the detection position of the second step or the last detection position may also be used.
- The spatiotemporal period information, spatiotemporal phase information, and spatiotemporal position information obtained in step S403 are stored in the spatiotemporal period information storage unit 130, the spatiotemporal phase information storage unit 131, and the spatiotemporal position information storage unit 132, respectively (step S404).
- Next, the image sequence input unit 100 acquires an image sequence 2 in which to search for the person, in the same manner as in step S401 (step S405). Then, the walking sequence extraction unit 110 extracts a walking sequence 2 from the image sequence 2 (step S406). Subsequently, the walking sequence extraction unit 110 determines whether or not another walking sequence was obtained as a result of the processing in step S406 (step S407). If no other walking sequence exists (No in step S407), the process ends; if one exists (Yes in step S407), the spatio-temporal period information extraction unit 120, the spatio-temporal phase information extraction unit 121, and the spatio-temporal position information extraction unit 122 extract the spatiotemporal period information, spatiotemporal phase information, and spatiotemporal position information from the walking sequence 2, as in the case of the walking sequence 1 (step S408).
- Next, the spatio-temporal period collation unit 140, the spatio-temporal phase collation unit 141, and the spatio-temporal difference extraction unit 142 collate the spatio-temporal period information, spatio-temporal phase information, and spatio-temporal position information stored in step S404 with the spatio-temporal period information, spatio-temporal phase information, and spatio-temporal position information extracted in step S408, respectively (step S409).
- For the collation, all three types of information may be used, or the collation may be performed using only one of them, such as only the spatiotemporal period information or only the spatiotemporal phase information.
- For example, when the spatio-temporal position information of the two walking sequences is t1, (xx1, yy1) and t3, (xx3, yy3), the temporal difference between t1 and t3 and the spatial difference between (xx1, yy1) and (xx3, yy3) are obtained and compared with θt and θd, respectively, where θt and θd are predetermined threshold values.
- The above three types of information may also be combined for the match determination. For example, the product or sum of the differences Dx, Dy in the spatio-temporal period information, the differences Dw, Dz in the spatio-temporal phase information, and the differences Dt, Dxy in the spatio-temporal position information may be compared with a coincidence criterion θmul, where θmul is a predetermined threshold value.
- Further, the criteria for the differences in the spatio-temporal period information and the spatio-temporal phase information may be changed based on the magnitude of the difference in the spatio-temporal position information. Since the spatio-temporal period and phase values are more likely to change as the temporal and spatial separation increases, loosening the matching criteria for the period and phase differences when the spatio-temporal position difference is large can reduce detection omissions.
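- The combination logic described above can be summarised in a few lines (a sketch under the stated assumptions; the threshold names follow θx … θxy in the text, and the relax parameter implements the criterion loosening):

```python
def sequences_match(dx, dy, dw, dz, dt, dxy, theta, relax=1.0):
    """dx, dy: period differences; dw, dz: phase differences;
    dt, dxy: position differences. `theta` maps "x" ... "xy" to the
    thresholds; relax > 1.0 loosens the period/phase criteria when the
    spatio-temporal separation is large, while a factor < 1.0 (like the
    beta mentioned later in the text) tightens them when it is small."""
    period_phase_ok = (dx <= theta["x"] * relax and dy <= theta["y"] * relax and
                       dw <= theta["w"] * relax and dz <= theta["z"] * relax)
    position_ok = dt <= theta["t"] and dxy <= theta["xy"]
    return period_phase_ok and position_ok
```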
- When spatio-temporal period information is used, its value is unlikely to change even if the orientation of the person with respect to the camera changes, so it has the advantage of facilitating search and tracking between temporally distant image sequences.
- Next, the coincidence determination unit 150 determines whether or not walking sequence 1 and walking sequence 2 match based on the collation result in step S409 (step S410). If they do not match (No in step S411), the process returns to step S406 to acquire a new walking sequence (step S406). If they match (Yes in step S411), the control unit 160 displays image sequence 1 and image sequence 2 on the display device 1060 and highlights the person regions of the matched walking sequences. An example of the display by the control unit 160 is shown in FIG. 17. In FIG. 17, image sequence 1 is displayed in an area 1130, image sequence 2 is displayed in an area 1140, and circumscribed rectangles 1110 and 1120 are added to the person regions corresponding to the matched walking sequences to highlight them.
- In general, spatio-temporal period information and spatio-temporal phase information change under the influence of various factors, such as individual traits (way of walking), the person's situation (being in a hurry, walking slowly, etc.), the type of footwear (heel height, range of ankle movement, etc.), belongings (weight biased to one side, the way they are held, etc.), clothing (range of leg movement, etc.), and road surface conditions (slipperiness, inclination, etc.), which makes it difficult to identify individuals from these alone.
- Therefore, the spatio-temporal position difference between the two walking sequences to be collated is obtained using the spatio-temporal position information, and by changing the matching criteria based on this difference, collation can take into account changes in place, footwear, clothing, belongings, and the like.
- When the spatio-temporal difference is small, about 10 seconds as in FIG. 13A and FIG. 13B, the differences in spatio-temporal period information and spatio-temporal phase information can also be expected to be small.
- In such a case, the above determination threshold values θx, θy, θw, and θz are multiplied by β (β is a constant less than 1.0), and the match determination is performed with the tightened criteria.
- As described above, the spatiotemporal period information, the spatiotemporal phase information, and the spatiotemporal position information are extracted from the walking sequences, and the walking sequences are collated on the basis of these pieces of information.
- In the present embodiment, the walking sequences are collated using both the spatiotemporal period information and the spatiotemporal phase information; however, even if only one of them is used, the walking sequences can be collated and the effects of the present invention can be obtained. Combining the two enables more detailed collation and can improve the accuracy of search and tracking.
- It is desirable that the image sequence input to the image sequence input unit 100 has a time length or number of frames that includes a walking process of at least one step, and an improvement in accuracy can be expected as the sequence becomes longer. Since a time length including one step or more is needed so that the specific walking state can be detected at least twice, an image sequence of about 0.5 seconds or more (about 15 frames at 30 frames/second) is desirable.
- Embodiment 2 of the present invention will be described with reference to FIG. 18A, FIG. 18B, and FIG. 19.
- In the present embodiment, a person is searched for and tracked across image sequence 1 and image sequence 2 captured by different cameras. Examples of the images are shown in FIGS. 18A and 18B.
- FIG. 18A shows image sequence 1
- FIG. 18B shows image sequence 2.
- In FIGS. 18A and 18B, a rectangular parallelepiped obstacle 900 is shown.
- In image sequence 1 shown in FIG. 18A, another person (not shown) is hidden behind the obstacle 900, and only a person 912 is visible.
- In image sequence 2 shown in FIG. 18B, a person 921 and a person 922 are shown.
- The person 912 corresponds to the person 922.
- However, the two persons in image sequence 2 are walking close to each other with similar appearance and clothes, so whether the person corresponding to the person 912 is the person 921 or the person 922 cannot be distinguished by texture or motion vectors.
- The grid-like broken lines in FIGS. 18A and 18B represent position coordinates on the ground, superimposed on the images to explain the correspondence between the position information of the two image sequences.
- The broken-line grid can be obtained by measuring the correspondence between positions in the camera image and positions at the shooting location (spatiotemporal correction information), either by actual measurement or by geometric calculation based on the camera layout and the optical system specifications.
- the lattice in Fig. 18A and the lattice in Fig. 18B represent the corresponding positions.
- FIG. 19 is a functional block diagram showing a configuration of the person determination device 15 in the present embodiment.
- This person determination device 15 includes, in addition to the configuration of the person determination device 10 of Embodiment 1, a spatio-temporal correction unit 170.
- The spatio-temporal correction unit 170 is a processing unit that, when extracting a walking sequence or calculating spatio-temporal period information, spatio-temporal phase information, and spatio-temporal position information, performs correction using the spatio-temporal correction information so as to compensate for inconsistencies between image sequences.
- In other words, the spatio-temporal correction unit 170 is an example of a correction means that performs spatio-temporal correction processing when extracting walking information.
- The person determination device 15 processes the video from the different cameras by the same procedure as in Embodiment 1, and can thereby identify the person in image sequence 2 who corresponds to a person in image sequence 1. Because the camera arrangements differ, positions in the two images do not correspond directly; the grid lines are therefore stored as spatio-temporal correction information for each camera (or for each image sequence) and used to correct the spatial position coordinates when the walking sequence is extracted. That is, the spatio-temporal correction unit 170 holds, as correction information, information specifying grid lines that divide the walking surface at the shooting location two-dimensionally at fixed intervals.
- By having the spatio-temporal correction unit 170 perform correction processing using this spatio-temporal correction information, it becomes possible to collate spatio-temporal period information, spatio-temporal phase information, and spatio-temporal position information between images from different cameras.
- For example, the spatio-temporal correction unit 170 corrects the spatio-temporal period information, spatio-temporal phase information, and spatio-temporal position information according to the ratio of the sides, or of the areas, of the small regions bounded by the grid lines (that is, it multiplies them by a proportionality factor), as sketched below.
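- As a rough sketch of this proportional correction, suppose the real-world side length of one grid cell is known; any spatial quantity measured in pixels can then be rescaled into ground units. The function and the example values below are illustrative assumptions, not the patent's exact procedure.

```python
def correct_spatial_scale(value_px: float,
                          cell_side_px: float,
                          cell_side_m: float = 1.0) -> float:
    """Rescale a spatial quantity measured in image pixels (a spatial
    period, a position difference, etc.) into ground units, using the
    apparent size of one grid cell in this camera as the reference."""
    return value_px * (cell_side_m / cell_side_px)

# The same 1 m grid cell spans 80 px in camera 1 but 40 px in camera 2;
# after correction, the same 0.5 m stride length is recovered from both.
stride_cam1 = correct_spatial_scale(40.0, cell_side_px=80.0)
stride_cam2 = correct_spatial_scale(20.0, cell_side_px=40.0)
print(stride_cam1, stride_cam2)  # -> 0.5 0.5
```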
- As the spatio-temporal phase information, the spatio-temporal position at which a specific walking state is first reached after crossing a specific grid line may be used. With such a configuration, the person hidden by the obstacle 900 (not shown) can be associated with person 921, and person 912 can be associated with person 922.
- Alternatively, when extracting the walking sequences, the image of one image sequence may first be converted, using the positional correspondence, so that its positions match those of the other image sequence, and the subsequent processing may then be performed on the converted images.
- Such image conversion can be performed using the above-mentioned plane projection matrix H: by applying the matrix H to a pixel position on the walking plane (the ground), that position can be converted into the corresponding position on the walking plane (the ground) in the other image sequence. The whole image can therefore be converted by applying the same conversion to every pixel.
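- A minimal sketch of this conversion, assuming the 3x3 plane projection (homography) matrix H between the two ground planes is already known; the matrix below is a made-up example, not a calibrated value.

```python
import numpy as np

def project_ground_point(H: np.ndarray, x: float, y: float) -> tuple:
    """Map a pixel position on the walking plane (ground) in one camera
    to the corresponding ground position in the other camera, using a
    3x3 plane projection matrix H in homogeneous coordinates."""
    p = H @ np.array([x, y, 1.0])
    return (p[0] / p[2], p[1] / p[2])  # de-homogenize

# Illustrative H: identity plus a 10-pixel shift along x.
H = np.array([[1.0, 0.0, 10.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
print(project_ground_point(H, 100.0, 50.0))  # -> (110.0, 50.0)
```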
- Spatio-temporal period information, spatio-temporal phase information, and spatio-temporal position information can then be calculated after correction based on this positional relationship, eliminating the effects of camera placement.
- When image sequences with different frame rates are collated, the spatio-temporal correction unit 170 either converts the frame rate of one sequence in advance so that the frame rates match, or corrects the time information when calculating the spatio-temporal period information, spatio-temporal phase information, and spatio-temporal position information after the walking sequence has been extracted. This makes it possible to search for and track people between image sequences with different frame rates.
- For example, when comparing an image sequence of 15 frames/second with an image sequence of 30 frames/second, an image sequence of 15 frames/second is generated from the latter by thinning out every other frame, and the walking sequences are then detected and collated between the two sequences.
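- A minimal sketch of this frame-rate alignment by thinning, treating a sequence as a plain list of frames (names are illustrative):

```python
def thin_to_rate(frames: list, src_fps: int, dst_fps: int) -> list:
    """Reduce the frame rate by keeping every (src_fps // dst_fps)-th
    frame; e.g. 30 frames/s -> 15 frames/s keeps every other frame.
    Assumes src_fps is an integer multiple of dst_fps."""
    if src_fps % dst_fps != 0:
        raise ValueError("src_fps must be an integer multiple of dst_fps")
    return frames[::src_fps // dst_fps]

print(thin_to_rate(list(range(6)), 30, 15))  # -> [0, 2, 4]
```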
- The frame rate information required for such correction may be held, as spatio-temporal correction information, in a storage device or the like in association with each image sequence.
- In the present embodiment, the grid is used to establish the positional correspondence between the camera images; however, when mainly the temporal period or temporal phase is used, this positional correspondence is not required, and the effects of the present invention can still be obtained by performing the collation without it.
- Part or all of the processing in the above embodiments may be performed by a dedicated device.
- Alternatively, the processing may be performed by a CPU built into a communication device such as a terminal or a base station, or by a computer executing a processing program.
- As described above, the present invention can be used as a person determination device that determines whether or not persons included in different image sequences are the same person, and as a person search and tracking device that searches for and tracks a person in image sequences, for example as part of a monitoring system.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Image Processing (AREA)
- Closed-Circuit Television Systems (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006531421A JP3910629B2 (ja) | 2004-08-03 | 2005-07-27 | 人物判定装置及び人物検索追跡装置 |
US11/342,651 US7397931B2 (en) | 2004-08-03 | 2006-01-31 | Human identification apparatus and human searching/tracking apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-227083 | 2004-08-03 | ||
JP2004227083 | 2004-08-03 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/342,651 Continuation US7397931B2 (en) | 2004-08-03 | 2006-01-31 | Human identification apparatus and human searching/tracking apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006013765A1 true WO2006013765A1 (ja) | 2006-02-09 |
Family
ID=35787054
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/013769 WO2006013765A1 (ja) | 2004-08-03 | 2005-07-27 | 人物判定装置及び人物検索追跡装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US7397931B2 (ja) |
JP (1) | JP3910629B2 (ja) |
CN (5) | CN101398892B (ja) |
WO (1) | WO2006013765A1 (ja) |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010033445A (ja) * | 2008-07-30 | 2010-02-12 | Fujitsu Ltd | 携帯端末装置及び認証管理方法 |
JP2011053952A (ja) * | 2009-09-02 | 2011-03-17 | Canon Inc | 画像検索装置及び画像検索方法 |
JP2012209732A (ja) * | 2011-03-29 | 2012-10-25 | Secom Co Ltd | 画像監視装置およびプログラム |
WO2013103151A1 (ja) * | 2012-01-04 | 2013-07-11 | 株式会社ニコン | 電子機器、情報生成方法、及び位置推定方法 |
JP2016057998A (ja) * | 2014-09-12 | 2016-04-21 | 株式会社日立国際電気 | 物体識別方法 |
JP2019074819A (ja) * | 2017-10-12 | 2019-05-16 | 株式会社コンピュータシステム研究所 | 監視装置、監視プログラム、記憶媒体、および、監視方法 |
JP2020077017A (ja) * | 2018-11-05 | 2020-05-21 | 公立大学法人大阪 | 歩容解析装置 |
WO2020115910A1 (ja) | 2018-12-07 | 2020-06-11 | 日本電気株式会社 | 情報処理システム、情報処理装置、情報処理方法、およびプログラム |
WO2020115890A1 (ja) | 2018-12-07 | 2020-06-11 | 日本電気株式会社 | 情報処理システム、情報処理装置、情報処理方法、およびプログラム |
WO2020136795A1 (ja) | 2018-12-27 | 2020-07-02 | 日本電気株式会社 | 情報処理装置、情報処理方法、およびプログラム |
WO2020136794A1 (ja) | 2018-12-27 | 2020-07-02 | 日本電気株式会社 | 情報処理システム、情報処理装置、情報処理方法、およびプログラム |
JPWO2022030179A1 (ja) * | 2020-08-05 | 2022-02-10 | ||
JP2022049568A (ja) * | 2020-09-16 | 2022-03-29 | 株式会社シンギュラリティテック | 人工知能による歩容認証のためのデータ前処理システム、方法、および、プログラム |
JP2022064719A (ja) * | 2020-10-14 | 2022-04-26 | 富士通クライアントコンピューティング株式会社 | 情報処理装置、情報処理システムおよび情報処理プログラム |
JPWO2022201987A1 (ja) * | 2021-03-23 | 2022-09-29 | ||
JP7193104B1 (ja) | 2021-11-25 | 2022-12-20 | 株式会社アジラ | 行動体同定システム |
US12020510B2 (en) | 2018-12-28 | 2024-06-25 | Nec Corporation | Person authentication apparatus, control method, and non-transitory storage medium |
US12136289B2 (en) | 2023-07-24 | 2024-11-05 | Nec Corporation | Information processing apparatus, information processing method, and non-transitory computer-readable storage medium |
Families Citing this family (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006013765A1 (ja) * | 2004-08-03 | 2006-02-09 | Matsushita Electric Industrial Co., Ltd. | 人物判定装置及び人物検索追跡装置 |
US8131022B2 (en) * | 2004-08-31 | 2012-03-06 | Panasonic Corporation | Surveillance recorder and its method |
JP4290164B2 (ja) * | 2006-01-31 | 2009-07-01 | キヤノン株式会社 | 識別領域を示す表示を画像と共に表示させる表示方法、コンピュータ装置に実行させるプログラム、および、撮像装置 |
JP4665933B2 (ja) * | 2006-07-04 | 2011-04-06 | セイコーエプソン株式会社 | 文書編集支援装置、プログラムおよび記憶媒体 |
JP4257615B2 (ja) * | 2006-07-14 | 2009-04-22 | ソニー株式会社 | 画像処理装置および方法、並びにプログラム |
JP4263737B2 (ja) * | 2006-11-09 | 2009-05-13 | トヨタ自動車株式会社 | 歩行者検知装置 |
JP4337867B2 (ja) * | 2006-12-01 | 2009-09-30 | セイコーエプソン株式会社 | 文書編集支援装置、文書編集装置、プログラムおよび記憶媒体 |
JP4315991B2 (ja) * | 2007-04-20 | 2009-08-19 | 本田技研工業株式会社 | 車両周辺監視装置、車両周辺監視方法、車両周辺監視プログラム |
US8432449B2 (en) * | 2007-08-13 | 2013-04-30 | Fuji Xerox Co., Ltd. | Hidden markov model for camera handoff |
JP2009053815A (ja) * | 2007-08-24 | 2009-03-12 | Nikon Corp | 被写体追跡プログラム、および被写体追跡装置 |
JP4921335B2 (ja) * | 2007-12-10 | 2012-04-25 | キヤノン株式会社 | ドキュメント処理装置及び検索方法 |
CN101350064B (zh) * | 2008-08-29 | 2012-06-13 | 北京中星微电子有限公司 | 二维人体姿态估计方法及装置 |
CN101388114B (zh) * | 2008-09-03 | 2011-11-23 | 北京中星微电子有限公司 | 一种人体姿态估计的方法和系统 |
JP4760918B2 (ja) * | 2009-01-23 | 2011-08-31 | カシオ計算機株式会社 | 撮像装置、被写体追従方法、及びプログラム |
JP5029647B2 (ja) * | 2009-04-08 | 2012-09-19 | 株式会社ニコン | 被写体追尾装置、およびカメラ |
US11080513B2 (en) * | 2011-01-12 | 2021-08-03 | Gary S. Shuster | Video and still image data alteration to enhance privacy |
CN102999152B (zh) * | 2011-09-09 | 2016-06-29 | 康佳集团股份有限公司 | 一种手势动作识别方法和系统 |
US9092675B2 (en) | 2012-03-29 | 2015-07-28 | The Nielsen Company (Us), Llc | Methods and apparatus to count people in images |
US9275285B2 (en) | 2012-03-29 | 2016-03-01 | The Nielsen Company (Us), Llc | Methods and apparatus to count people in images |
US8761442B2 (en) * | 2012-03-29 | 2014-06-24 | The Nielsen Company (Us), Llc | Methods and apparatus to count people in images |
US8660307B2 (en) | 2012-03-29 | 2014-02-25 | The Nielsen Company (Us), Llc | Methods and apparatus to count people in images |
JP5912062B2 (ja) * | 2012-05-24 | 2016-04-27 | オリンパス株式会社 | 撮影機器及び動画像データの記録方法 |
KR102120864B1 (ko) * | 2013-11-06 | 2020-06-10 | 삼성전자주식회사 | 영상 처리 방법 및 장치 |
CN104639517B (zh) * | 2013-11-15 | 2019-09-17 | 阿里巴巴集团控股有限公司 | 利用人体生物特征进行身份验证的方法和装置 |
US10026019B2 (en) * | 2014-03-11 | 2018-07-17 | Mitsubishi Electric Corporation | Person detecting device and person detecting method |
KR102015588B1 (ko) * | 2015-07-16 | 2019-08-28 | 한화테크윈 주식회사 | 배회 경보 방법 및 장치 |
KR101732981B1 (ko) * | 2015-10-29 | 2017-05-08 | 삼성에스디에스 주식회사 | 개인화 특성 분석 시스템 및 방법 |
US9911198B2 (en) | 2015-12-17 | 2018-03-06 | Canon Kabushiki Kaisha | Method, system and apparatus for matching moving targets between camera views |
JP6879296B2 (ja) | 2016-03-31 | 2021-06-02 | 日本電気株式会社 | 画像検出装置、画像検出方法、及びプログラム |
CN109477951B (zh) * | 2016-08-02 | 2021-10-22 | 阿特拉斯5D公司 | 在保护隐私的同时识别人及/或识别并量化疼痛、疲劳、情绪及意图的系统及方法 |
JP6888950B2 (ja) * | 2016-12-16 | 2021-06-18 | フォルシアクラリオン・エレクトロニクス株式会社 | 画像処理装置、外界認識装置 |
JP6800820B2 (ja) * | 2017-07-14 | 2020-12-16 | パナソニック株式会社 | 人流分析方法、人流分析装置、及び人流分析システム |
CN107730686A (zh) * | 2017-11-01 | 2018-02-23 | 桐乡守敬应用技术研究院有限公司 | 一种生物特征解锁方法 |
TWI697914B (zh) * | 2018-11-29 | 2020-07-01 | 宏碁股份有限公司 | 監測系統及其方法 |
CN109859322B (zh) * | 2019-01-22 | 2022-12-06 | 广西大学 | 一种基于变形图的谱姿态迁移方法 |
JP7218820B2 (ja) * | 2019-12-25 | 2023-02-07 | 日本電気株式会社 | 推定装置、推定システム、推定方法、およびプログラム |
JP7198196B2 (ja) * | 2019-12-26 | 2022-12-28 | 株式会社日立ハイテク | 計測装置及び計測方法 |
US11315363B2 (en) * | 2020-01-22 | 2022-04-26 | Board Of Trustees Of Michigan State University | Systems and methods for gait recognition via disentangled representation learning |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0433066A (ja) * | 1990-05-24 | 1992-02-04 | Nippon Telegr & Teleph Corp <Ntt> | 個人識別装置 |
JP2000182060A (ja) * | 1998-12-21 | 2000-06-30 | Nec Corp | 個人識別装置及び個人識別方法 |
JP2003346159A (ja) * | 2002-05-28 | 2003-12-05 | Oki Electric Ind Co Ltd | 人物追跡方法及び人物追跡装置 |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6205231B1 (en) * | 1995-05-10 | 2001-03-20 | Identive Corporation | Object identification in a moving video image |
US5885229A (en) * | 1995-07-19 | 1999-03-23 | Nippon Telegraph & Telephone Corp. | Walking pattern processing method and system for embodying the same |
US6263088B1 (en) * | 1997-06-19 | 2001-07-17 | Ncr Corporation | System and method for tracking movement of objects in a scene |
US7116323B2 (en) * | 1998-05-27 | 2006-10-03 | In-Three, Inc. | Method of hidden surface reconstruction for creating accurate three-dimensional images converted from two-dimensional images |
US6542621B1 (en) * | 1998-08-31 | 2003-04-01 | Texas Instruments Incorporated | Method of dealing with occlusion when tracking multiple objects and people in video sequences |
JP2002197437A (ja) * | 2000-12-27 | 2002-07-12 | Sony Corp | 歩行検出システム、歩行検出装置、デバイス、歩行検出方法 |
US6525663B2 (en) * | 2001-03-15 | 2003-02-25 | Koninklijke Philips Electronics N.V. | Automatic system for monitoring persons entering and leaving changing room |
US20030123703A1 (en) * | 2001-06-29 | 2003-07-03 | Honeywell International Inc. | Method for monitoring a moving object and system regarding same |
US7035431B2 (en) * | 2002-02-22 | 2006-04-25 | Microsoft Corporation | System and method for probabilistic exemplar-based pattern tracking |
TW582168B (en) * | 2002-03-01 | 2004-04-01 | Huper Lab Co Ltd | Method for abstracting multiple moving objects |
JP4187448B2 (ja) * | 2002-03-07 | 2008-11-26 | 富士通マイクロエレクトロニクス株式会社 | 画像における移動体追跡方法及び装置 |
US7113185B2 (en) * | 2002-11-14 | 2006-09-26 | Microsoft Corporation | System and method for automatically learning flexible sprites in video layers |
JP3947973B2 (ja) * | 2003-02-14 | 2007-07-25 | ソニー株式会社 | 画像処理装置および方法、プログラム、並びに記録媒体 |
WO2006013765A1 (ja) * | 2004-08-03 | 2006-02-09 | Matsushita Electric Industrial Co., Ltd. | 人物判定装置及び人物検索追跡装置 |
JP2007241500A (ja) * | 2006-03-07 | 2007-09-20 | Toshiba Corp | 顔認証装置および顔認証方法 |
2005
- 2005-07-27 WO PCT/JP2005/013769 patent/WO2006013765A1/ja active Application Filing
- 2005-07-27 JP JP2006531421A patent/JP3910629B2/ja not_active Expired - Fee Related
- 2005-07-27 CN CN2008102100009A patent/CN101398892B/zh not_active Expired - Fee Related
- 2005-07-27 CN CN2008101459917A patent/CN101398890B/zh not_active Expired - Fee Related
- 2005-07-27 CN CN2008102100013A patent/CN101344923B/zh not_active Expired - Fee Related
- 2005-07-27 CN CNB2005800008924A patent/CN100474339C/zh not_active Expired - Fee Related
- 2005-07-27 CN CN2008101459921A patent/CN101398891B/zh not_active Expired - Fee Related
2006
- 2006-01-31 US US11/342,651 patent/US7397931B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0433066A (ja) * | 1990-05-24 | 1992-02-04 | Nippon Telegr & Teleph Corp <Ntt> | 個人識別装置 |
JP2000182060A (ja) * | 1998-12-21 | 2000-06-30 | Nec Corp | 個人識別装置及び個人識別方法 |
JP2003346159A (ja) * | 2002-05-28 | 2003-12-05 | Oki Electric Ind Co Ltd | 人物追跡方法及び人物追跡装置 |
Non-Patent Citations (1)
Title |
---|
NIYOGI S.A. ET AL.: "Analyzing and Recognizing Walking Figures in XYT", M.I.T. Media Lab Vision and Modeling Group, Technical Report No. 223, 1994, XP002904545 *
Cited By (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010033445A (ja) * | 2008-07-30 | 2010-02-12 | Fujitsu Ltd | 携帯端末装置及び認証管理方法 |
JP2011053952A (ja) * | 2009-09-02 | 2011-03-17 | Canon Inc | 画像検索装置及び画像検索方法 |
JP2012209732A (ja) * | 2011-03-29 | 2012-10-25 | Secom Co Ltd | 画像監視装置およびプログラム |
WO2013103151A1 (ja) * | 2012-01-04 | 2013-07-11 | 株式会社ニコン | 電子機器、情報生成方法、及び位置推定方法 |
JP2016057998A (ja) * | 2014-09-12 | 2016-04-21 | 株式会社日立国際電気 | 物体識別方法 |
JP2019074819A (ja) * | 2017-10-12 | 2019-05-16 | 株式会社コンピュータシステム研究所 | 監視装置、監視プログラム、記憶媒体、および、監視方法 |
JP7325745B2 (ja) | 2017-10-12 | 2023-08-15 | 株式会社コンピュータシステム研究所 | 監視装置、監視プログラム、記憶媒体、および、監視方法 |
JP2020077017A (ja) * | 2018-11-05 | 2020-05-21 | 公立大学法人大阪 | 歩容解析装置 |
JP7182778B2 (ja) | 2018-11-05 | 2022-12-05 | 公立大学法人大阪 | 歩容解析装置 |
WO2020115890A1 (ja) | 2018-12-07 | 2020-06-11 | 日本電気株式会社 | 情報処理システム、情報処理装置、情報処理方法、およびプログラム |
US12087075B2 (en) | 2018-12-07 | 2024-09-10 | Nec Corporation | Information processing apparatus, information processing method, and non-transitory computer-readable storage medium |
US12087076B2 (en) | 2018-12-07 | 2024-09-10 | Nec Corporation | Information processing apparatus, information processing method, and non-transitory computer-readable storage medium |
US12046075B2 (en) | 2018-12-07 | 2024-07-23 | Nec Corporation | Information processing apparatus, information processing method, and program |
US12112565B2 (en) | 2018-12-07 | 2024-10-08 | Nec Corporation | Information processing apparatus, information processing method, and non-transitory computer-readable storage medium |
US11847848B2 (en) | 2018-12-07 | 2023-12-19 | Nec Corporation | Information processing apparatus, information processing method, and program |
WO2020115910A1 (ja) | 2018-12-07 | 2020-06-11 | 日本電気株式会社 | 情報処理システム、情報処理装置、情報処理方法、およびプログラム |
US11971952B2 (en) | 2018-12-27 | 2024-04-30 | Nec Corporation | Information processing apparatus, information processing method, and program |
US11928181B2 (en) | 2018-12-27 | 2024-03-12 | Nec Corporation | Information processing apparatus, information processing method, and program |
WO2020136795A1 (ja) | 2018-12-27 | 2020-07-02 | 日本電気株式会社 | 情報処理装置、情報処理方法、およびプログラム |
WO2020136794A1 (ja) | 2018-12-27 | 2020-07-02 | 日本電気株式会社 | 情報処理システム、情報処理装置、情報処理方法、およびプログラム |
US11934483B2 (en) | 2018-12-27 | 2024-03-19 | Nec Corporation | Information processing apparatus, information processing method, and program |
US11704933B2 (en) | 2018-12-27 | 2023-07-18 | Nec Corporation | Information processing apparatus, information processing method, and program |
US11704934B2 (en) | 2018-12-27 | 2023-07-18 | Nec Corporation | Information processing apparatus, information processing method, and program |
US11710347B2 (en) | 2018-12-27 | 2023-07-25 | Nec Corporation | Information processing apparatus, information processing method, and program |
US11361588B2 (en) | 2018-12-27 | 2022-06-14 | Nec Corporation | Information processing apparatus, information processing method, and program |
US12020510B2 (en) | 2018-12-28 | 2024-06-25 | Nec Corporation | Person authentication apparatus, control method, and non-transitory storage medium |
WO2022030179A1 (ja) * | 2020-08-05 | 2022-02-10 | 国立大学法人大阪大学 | 周期画像復元装置及び方法、識別装置及び方法、検証装置及び方法、特徴抽出装置、訓練方法、位相推定装置、並びに記憶媒体 |
JP7353686B2 (ja) | 2020-08-05 | 2023-10-02 | 国立大学法人大阪大学 | 周期画像復元装置及び方法、識別装置及び方法、検証装置及び方法、特徴抽出装置、訓練方法、位相推定装置、並びに記憶媒体 |
JPWO2022030179A1 (ja) * | 2020-08-05 | 2022-02-10 | ||
JP7296538B2 (ja) | 2020-09-16 | 2023-06-23 | 株式会社シンギュラリティテック | 人工知能による歩容認証のためのデータ前処理システム、方法、および、プログラム |
JP2022049568A (ja) * | 2020-09-16 | 2022-03-29 | 株式会社シンギュラリティテック | 人工知能による歩容認証のためのデータ前処理システム、方法、および、プログラム |
JP2022064719A (ja) * | 2020-10-14 | 2022-04-26 | 富士通クライアントコンピューティング株式会社 | 情報処理装置、情報処理システムおよび情報処理プログラム |
JP7525055B2 (ja) | 2021-03-23 | 2024-07-30 | 日本電気株式会社 | 画像解析装置、画像解析方法及びプログラム |
WO2022201987A1 (ja) * | 2021-03-23 | 2022-09-29 | 日本電気株式会社 | 画像解析装置、画像解析システム、画像解析方法及びプログラム |
JPWO2022201987A1 (ja) * | 2021-03-23 | 2022-09-29 | ||
JP2023078062A (ja) * | 2021-11-25 | 2023-06-06 | 株式会社アジラ | 行動体同定システム |
JP7193104B1 (ja) | 2021-11-25 | 2022-12-20 | 株式会社アジラ | 行動体同定システム |
US12141232B2 (en) | 2023-07-14 | 2024-11-12 | Nec Corporation | Information processing apparatus, information processing method, and non-transitory computer-readable storage medium |
US12141231B2 (en) | 2023-07-14 | 2024-11-12 | Nec Corporation | Information processing apparatus, information processing method, and non-transitory computer-readable storage medium |
US12136289B2 (en) | 2023-07-24 | 2024-11-05 | Nec Corporation | Information processing apparatus, information processing method, and non-transitory computer-readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN101398891A (zh) | 2009-04-01 |
CN1842824A (zh) | 2006-10-04 |
CN101398892B (zh) | 2010-12-22 |
US20060120564A1 (en) | 2006-06-08 |
JP3910629B2 (ja) | 2007-04-25 |
CN101398890A (zh) | 2009-04-01 |
JPWO2006013765A1 (ja) | 2008-05-01 |
CN101398891B (zh) | 2010-12-08 |
CN101398890B (zh) | 2010-12-08 |
CN101344923B (zh) | 2012-05-23 |
CN101344923A (zh) | 2009-01-14 |
CN101398892A (zh) | 2009-04-01 |
CN100474339C (zh) | 2009-04-01 |
US7397931B2 (en) | 2008-07-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006013765A1 (ja) | 人物判定装置及び人物検索追跡装置 | |
US9384563B2 (en) | Image processing device, method, and computer program product | |
KR101118654B1 (ko) | 모션캡쳐 기반의 자세분석을 통한 재활 장치 및 이에 따른 재활 방법 | |
US9275276B2 (en) | Posture estimation device and posture estimation method | |
JP6847254B2 (ja) | 歩行者追跡の方法および電子デバイス | |
JP5148669B2 (ja) | 位置検出装置、位置検出方法、及び位置検出プログラム | |
Ning et al. | People tracking based on motion model and motion constraints with automatic initialization | |
CN108875507B (zh) | 行人跟踪方法、设备、系统和计算机可读存储介质 | |
CN114556268A (zh) | 一种姿势识别方法及装置、存储介质 | |
JP2015052999A (ja) | 個人特徴抽出プログラム、個人特徴抽出装置、および個人特徴抽出方法 | |
EP3438601A1 (en) | Measurement device, measurement method, and computer readable recording medium | |
US20200226787A1 (en) | Information processing apparatus, information processing method, and program | |
JPH0792369B2 (ja) | 画像計測装置 | |
CN115527265A (zh) | 一种基于体育训练的动作捕捉方法及系统 | |
JP7287686B2 (ja) | 関節位置取得装置及び方法 | |
KR20140123399A (ko) | 사용자 영상의 신체 부위를 검출하는 장치 및 방법 | |
JP5246946B2 (ja) | 全身領域推定装置 | |
JP6944144B2 (ja) | スイング解析装置、方法及びプログラム | |
JP7554245B2 (ja) | 人物追跡方法及び人物追跡装置 | |
US20240119087A1 (en) | Image processing apparatus, image processing method, and non-transitory storage medium | |
US20240119620A1 (en) | Posture estimation apparatus, posture estimation method, and computer-readable recording medium | |
Akhavizadegan et al. | REAL-TIME AUTOMATED CONTOUR BASED MOTION TRACKING USING A SINGLE-CAMERA FOR UPPER LIMB ANGULAR MOTION MEASUREMENT | |
WO2021130913A1 (ja) | 相対位置検出装置、相対位置検出方法、及び相対位置検出プログラム | |
JP2023046566A (ja) | 選定プログラム、選定方法および情報処理装置 | |
CN116563576A (zh) | 姿态比对方法、估计方法、装置及电子设备 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200580000892.4 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006531421 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11342651 Country of ref document: US |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWP | Wipo information: published in national office |
Ref document number: 11342651 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |