
EP3080752A1 - Camera-based tracking system for the determination of physical, physiological and/or biometric data and/or for risk assessment - Google Patents

Camera-based tracking system for the determination of physical, physiological and/or biometric data and/or for risk assessment

Info

Publication number
EP3080752A1
Authority
EP
European Patent Office
Prior art keywords
tracking system
entity
camera
safety
data processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP14825281.0A
Other languages
German (de)
French (fr)
Inventor
Cédric BORNAND
Thomas Dreher
Gregory MEDWED
Xavier VEUTHEY
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Viacam Sarl
Original Assignee
Viacam Sarl
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Viacam Sarl filed Critical Viacam Sarl
Priority to EP14825281.0A priority Critical patent/EP3080752A1/en
Publication of EP3080752A1 publication Critical patent/EP3080752A1/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/23Recognition of whole body movements, e.g. for sport training

Definitions

  • the present invention generally relates to image generation and analysis technology.
  • the invention relates to the generation of data or parameters related to tracking, and determination of physical, physiological and/or biometric properties using cameras and data processing units. More specifically, the invention concerns a tracking system, uses of the tracking system and methods as defined.
  • US 2013/0188031 discloses a risk recognition system based on human identification. A person is photographed and identified on the basis of photograph data available with respect to the person in a database. By tracking the movement path of the identified human, the system is capable of identifying the possibility of a dangerous situation.
  • US 2008/0129825 discloses an autonomous picture production system, comprising an object tracing device, a motorized camera and a camera control device. Each traced object carries a location unit transmitting a signal allowing the determination of the position of the object. This system may be used to produce images of an object or a person during a sports event, for example, in an automated manner.
  • US 2011/0208444 discloses a system for measuring balance and track motion in mammals.
  • the system comprises a band configured for attachment to a body part of a mammal for sensing, for example, the circumferential pressure at the body part where the band is attached.
  • a living individual such as a human or animal
  • biometric data and/or data concerning the status for example physiological status of the individual, such as the respiratory rate, the heart rate, transpiration from a body, body temperature, and constraints, forces, tension and/or extension experienced by the body.
  • It is an objective to determine these parameters preferably without physically contacting the moving entity, for example without equipping the moving entity with a sensor, sender, or other data generating and/or transmitting unit having additional weight and requiring attachment to the moving entity. Attachment of a data generating unit on a moving entity may in some way influence the moving entity, which the present invention aims to prevent. It is an objective of the invention to produce relevant data and parameters as specified from image-related data.
  • a further objective of the invention is to assess the risk of accident or injury of a moving entity, for example in traffic or during a sports event. It is in particular an objective to provide the possibility of producing a warning or intervening in case there is a risk of accident and/or injury, or any type of damage in general. For example, in case of equestrian sports, it is an objective to monitor the physiological status of a horse in order to determine if there is a risk of health damage, injury, or accident, for example. Furthermore, it is an objective of the invention to provide data related to participants of a sports event. For example, it is an objective to display data related to the data and parameters specified above, for example to render the sports event more interesting or to have further insight.
  • the present invention provides a tracking system comprising one or more cameras adapted to take images, one or more data processing units, said tracking system further comprising one or more output or display units, wherein said one or more camera is adapted and/or positioned to observe a geographical area of interest, and wherein said one or more data processing unit is arranged to detect an entity of interest in the area of interest.
  • the entity of interest may be one or more selected from any object of interest, a person, and an animal.
  • the entity may be moving.
  • the tracking system is configured to identify a reference zone on at least some of said images, wherein said reference zone is on or associated with said entity in said image.
  • At least one camera of the tracking system is an infrared camera and the data processing unit is configured to identify in the images an individual, such as a person or an animal, and to determine from the images taken by the camera the individual's respiratory rate.
  • At least one camera of the tracking system is a hyperspectral and/or multispectral camera and the data processing unit is configured to identify in the images an individual, such as a person or an animal, and to determine from the images taken by the camera the individual's heartbeat rate.
  • the data processing unit of the tracking system is configured to determine, from images taken from an individual, the individual's transpiration.
  • the invention provides the use of an infrared camera for determining the respiratory rate of an individual. In an aspect, the invention provides the use of a succession of images taken by an infrared camera for determining the respiratory rate of an individual.
  • the invention provides a method for determining the respiratory rate of an individual, the method comprising the step of determining said rate by analysing a succession of images generated by an infrared camera.
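The patent claims the principle (a respiratory rate derived from a succession of infrared images) without fixing an algorithm. The following is a minimal Python sketch of one way this could be implemented, assuming exhaled air transiently warms a region of interest near the nostrils; the function name, frame format, ROI convention and temperature threshold are all illustrative assumptions, not taken from the patent.

```python
def respiratory_rate(frames, roi, fps, threshold):
    """Estimate breaths per minute from a sequence of thermal frames.

    frames: list of 2D lists of per-pixel temperature values
    roi: (row0, row1, col0, col1) region near the nostrils (assumed)
    fps: frames per second of the camera
    threshold: temperature above which an exhalation is assumed (assumed)
    """
    r0, r1, c0, c1 = roi
    # Mean ROI temperature per frame: exhaled air warms the region.
    means = [
        sum(row[c] for row in f[r0:r1] for c in range(c0, c1))
        / ((r1 - r0) * (c1 - c0))
        for f in frames
    ]
    # Count rising edges through the threshold (start of each exhalation).
    breaths = sum(
        1 for prev, cur in zip(means, means[1:]) if prev < threshold <= cur
    )
    duration_min = len(frames) / fps / 60.0
    return breaths / duration_min
```

A real implementation would first locate and track the nostril region on each frame, as described for the reference zone elsewhere in this specification.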
  • the invention provides the use of a hyperspectral and/or multispectral camera for determining the heart rate of an individual. In an aspect, the invention provides the use of a succession of images produced by a hyperspectral and/or multispectral camera for determining the heart rate of an individual.
  • the invention provides a method for determining the heart rate and/or pulse of an individual, the method comprising the step of determining said rate by analysing a succession of images generated by a hyperspectral and/or multispectral camera.
  • the tracking system of the invention comprises a hyperspectral and/or multispectral camera 3, wherein said one or more data processing unit 5, and in particular said image analysis unit 6 is capable of detecting, on a reference zone 14 on the skin of said individual, a change of light, and in particular light intensity within a given wavelength or wavelength range, and wherein said data processing unit 5 and/or said image analysis unit 7 is adapted to determine, from said light intensity, the event of a pulse beat and/or heart rate of said individual.
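The specification describes detecting a periodic change of light intensity on the reference zone 14 and deriving a pulse rate from it, but prescribes no algorithm. One plausible sketch, in Python, probes candidate pulse frequencies with a discrete Fourier correlation and picks the dominant one; the function name, search range and signal format are assumptions for illustration only.

```python
import math

def heart_rate_bpm(intensity, fps, min_bpm=40, max_bpm=220):
    """Estimate pulse rate from a per-frame skin-intensity signal.

    intensity: mean reflected intensity of the reference zone, one value
    per frame, within a chosen wavelength band.
    Searches candidate frequencies for the sinusoid that correlates best
    with the mean-removed signal.
    """
    n = len(intensity)
    mean = sum(intensity) / n
    x = [v - mean for v in intensity]
    best_bpm, best_power = min_bpm, -1.0
    for bpm in range(min_bpm, max_bpm + 1):
        f = bpm / 60.0  # candidate frequency in beats per second
        re = sum(v * math.cos(2 * math.pi * f * i / fps) for i, v in enumerate(x))
        im = sum(v * math.sin(2 * math.pi * f * i / fps) for i, v in enumerate(x))
        power = re * re + im * im
        if power > best_power:
            best_power, best_bpm = power, bpm
    return best_bpm
```

In practice an FFT over a sliding window would be used instead of this brute-force search, but the principle (finding the dominant periodicity in the intensity signal, as in the repeating patterns of Figure 13 B) is the same.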
  • the invention provides the use of one or more camera and/or one or more images produced by the camera for determining a parameter related to transpiration of an individual.
  • the invention provides the use of one or more images for determining a parameter related to transpiration of an individual, the method comprising the step of determining transpiration from the reflection of light from the skin of the individual.
  • the tracking system of the invention may comprise safety rules associated with one or more of said physiological parameters, such as respiration, heart rate, transpiration, for example.
  • the system may thus determine an undesired situation as defined elsewhere in this specification if threshold values with respect to one or more of these parameters are met, exceeded and/or undercut, for example.
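The safety rules described above can be represented as per-parameter threshold ranges. The sketch below is one simple way to encode and check them; the specific parameter names and numeric limits are hypothetical and would have to be set by the operator of the system.

```python
# Hypothetical safety rules: each parameter has an allowed (low, high) range.
SAFETY_RULES = {
    "heart_rate_bpm": (25, 180),
    "respiratory_rate_bpm": (8, 120),
    "body_temperature_c": (36.0, 40.5),
}

def check_safety(measured, rules=SAFETY_RULES):
    """Return the parameters whose values undercut or exceed their thresholds.

    A non-empty result corresponds to an 'undesired situation' that could
    trigger a safety measure such as a warning.
    """
    violations = []
    for name, value in measured.items():
        low, high = rules.get(name, (float("-inf"), float("inf")))
        if value < low or value > high:
            violations.append(name)
    return violations
```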
  • the invention provides the use of the tracking system of the invention for monitoring, tracing, tracking and/or displaying one or more selected from the group of: a professional activity, a security and/or safety training event and/or security and/or safety exercise, a sports event, and a military training event.
  • the invention provides the use of the tracking system of the invention for monitoring, tracing, tracking and/or displaying the scene of a sports event, of a professional activity, of a training event, such as a security or safety exercise or training event, and of a military training event.
  • the invention provides the use of the tracking system of the invention for monitoring, tracing, tracking and/or displaying a sportsperson, for example at a sports and/or training event.
  • the invention provides the use of the tracking system of the invention for assessing the risk of an accident involving, for example one or more individuals and/or vehicles, and/or of bodily harm of an individual, for example a sportsperson, for example during training and/or competition.
  • the invention provides the use of the tracking system for detecting one or more selected from the group of: (1) a deviation from safety rules, (2) an increased risk of accident, and, (3) other potentially harmful events.
  • the invention provides the use of the tracking system for reducing the occurrence of accidents in professional environments and/or for increasing the safety in professional environments.
  • the invention provides the use of the tracking system of the invention for displaying parameters determined by the tracking system of the invention. Preferably, said parameters are displayed in real time.
  • the invention provides a method for monitoring, tracing and/or tracking an object and/or individual, the method comprising the step of providing the tracking system of the invention and monitoring, tracing and/or tracking an object and/or individual with said system.
  • the invention provides a method for collecting information and/or determining one or more parameters of an object and/or individual during an event selected from the group of: professional activity, a professional and/or security or safety training event, a sports event and/or training event in general.
  • the method comprises the steps of: producing an image and/or a sequence of images of said object and/or individual; computing, from said image(s), said information and/or parameter(s).
  • Such parameters are, for example, physical parameters, such as the trajectory, the direction of movement, the position, the speed, or the number of steps used for a certain distance, or physiological parameters such as respiratory rate, heartbeat rate, cardiac rhythm, temperature, respiration, transpiration, and galvanic properties of the skin of one or more entities.
  • the data obtained in this manner can be combined with data retrieved from locally installed microsystems that are connected to a central installation via wireless connection networks, such as optical or radio wave based means.
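Physical parameters such as speed and direction of movement follow directly from successive ground-plane positions of the tracked entity. As a minimal illustration (the function name and units are assumptions), two calibrated positions taken a known time apart yield both:

```python
import math

def speed_and_direction(p0, p1, dt):
    """Instantaneous speed (m/s) and heading (degrees, counter-clockwise
    from the x-axis) between two ground positions (x, y) in metres taken
    dt seconds apart."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    speed = math.hypot(dx, dy) / dt
    heading = math.degrees(math.atan2(dy, dx)) % 360.0
    return speed, heading
```

Applied frame by frame, this also yields the trajectory and, integrated over a known reference distance, the number of steps per distance discussed above.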
  • the tracking system of the invention which is capable of measuring physical, physiological, and/or biometric parameters.
  • Figure 1 schematically represents a tracking system in accordance with an embodiment of the invention, using two video cameras, one sensitive to infrared light allowing quick and reliable recognition of a human horse rider, and the other camera sensitive to visible light, allowing a detailed analysis of the rider or his action.
  • Figure 2 schematically shows in greater detail a data processing unit of an embodiment of the invention.
  • Figures 3a and 3b schematically represent a tracking system comprising a plurality of cameras, which have different fields of view, in order to increase the monitored area (Figure 3a) or in case of zones covered by several cameras, for increasing the degree or level of observability and/or to create different angles and/or viewpoints (Figure 3b).
  • FIG. 4 schematically shows a tracking system in accordance with an embodiment of the invention, the system comprising a hyperspectral camera the spectral resolution of which in the range from 700 to 1100 nm is at least 20 nm.
  • This type of camera, in combination with an appropriate data processing unit, is capable of detecting the respiration rate of a living subject.
  • Figure 5 schematically illustrates an image acquisition system in accordance with an embodiment of the invention, comprising a tracking system adapted to follow the trajectory of a horse and to determine information regarding the number of steps/strides and/or length of steps/strides. The information is displayed on a screen.
  • Figure 6 schematically illustrates the generation of data and its storage and display in accordance with an embodiment of the invention.
  • several moving objects or individuals are identified, so that the acquired parameters can be used to make comparisons between the different individuals.
  • Figure 7 illustrates a system for tracking motorized forklift trucks in accordance with an embodiment of the invention.
  • the forklift is equipped with a tag that can be easily identified by the image treatment/analysis system.
  • the analysis of the images generates data that can be compared to rules and which can thereafter be stored in an encrypted database that can be consulted by a user for statistical purposes. If data protection conditions allow, the user may get access to a specific image associated with the result of the statistical data analysis.
  • Figures 8A to 8C show images taken with an infrared camera, allowing detection of whether individuals wear safety glasses.
  • Figures 9A to 9C are technical drawings corresponding to the images shown in Figures 8A to 8C.
  • Figure 10 shows an image taken by a visible light (RGB) camera of a tracking system of the invention configured to detect whether human operators wear hardhats or not.
  • Figure 11 is a technical drawing corresponding to the image of Fig. 10.
  • Figures 12A and 12B show thermal pictures generated, processed and/or analysed in accordance with an embodiment of the invention.
  • the event of exhalation can be recognized in Figure 12A and can be seen to be absent in Figure 12B.
  • the data processing system can determine the respiratory rate of an individual.
  • Figure 13A shows stacked image data captured by a hyperspectral camera over a period of time, in accordance with an embodiment of the invention.
  • the image data is captured in the form of a line shown horizontally in the figure, with the first image/line captured being represented on top of the figure.
  • Figure 13B shows the outcome of the analysis of the image data generated in Figure 13A. From changes of light intensity, repeating patterns become apparent, which the data processing unit, in particular an analysing unit, can use to determine the heart rate of the individual from which the image data was created.
  • Figure 14 is a photograph taken with an RGB (visible light) camera, calibrated in accordance with an embodiment of the invention, so as to contain a grid or coordinate system representing the position on the ground.
  • Figure 15 is a device allowing rapid geometric correction and/or calibration of a camera used in the system of an embodiment of the invention, the calibration device comprising light sources connected to each other via flexible connections having a defined distance.
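The known separation between the calibration light sources allows a pixel-to-ground scale to be derived. The sketch below shows the simplest version of this idea, under the stated simplifying assumption that perspective distortion is negligible across the calibrated zone; a full geometric correction as suggested by Figures 14 and 15 would instead fit a homography from several such point pairs.

```python
import math

def metres_per_pixel(px_a, px_b, true_distance_m):
    """Derive a ground-plane scale factor from two calibration light
    sources imaged at pixel positions px_a and px_b, whose physical
    separation true_distance_m is fixed by the calibration device."""
    pixel_distance = math.hypot(px_b[0] - px_a[0], px_b[1] - px_a[1])
    return true_distance_m / pixel_distance
```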
  • Figure 16 shows an image captured by an infrared camera, allowing rapid detection of a target entity, which may be used for assisting the localisation algorithms analysing an RGB image.
  • FIG. 1 shows an embodiment of a tracking system 1 of the present invention.
  • the system comprises one or more cameras 2, 3. There is a first camera that is sensitive to visible light 2, and a second camera, which is an infrared camera 3.
  • the cameras may take still and/or single images, but preferably one or both of the cameras are adapted to take successive images, in particular videos and/or films.
  • the system further comprises one or more data processing units 5, which is/are capable of analyzing the images as will be described further below.
  • the data processing unit preferably is an image data processing unit.
  • the system of the invention further comprises one or more output or display units 10, 11. In some embodiments, the output unit is used for displaying the parameters and data determined by the data processing unit(s) 5.
  • the output unit may be for taking safety measures, such as producing warnings with respect to the occurrence of a risk as determined by the tracking system of the invention.
  • the warning may be a sound-related warning, in which case the output unit preferably comprises a loudspeaker.
  • the warning may also be displayed on a screen or transmitted via an electronic message, such as an email or sms.
  • Other safety measures are disclosed elsewhere in this specification.
  • the tracking system is configured to analyze at least one physical, physiological and/or biometric parameter associated with said entity 13.
  • the cameras 2 and/or 3 are adapted to monitor a determined geographical area of interest 12. Events taking place in the area of interest 12 are the subject of measurements and parameter determination in accordance with the invention.
  • the area of interest 12 may be any area for which data as reported herein may be of interest.
  • the area of interest 12 may be a working area and/or an area where professional activity is taking place.
  • the area of interest 12 may also be an area where a sports activity takes place, for example a tournament and/or a competition.
  • the system of the invention is preferably adapted to produce data with respect to an entity 13 moving within the area of interest.
  • the system of the invention may be used for monitoring professional activities, training and/or security or safety events or exercises, military training events and/or a sports event.
  • Security and safety training events include, for example, training events of the fire brigade, police, ambulance, outpatient departments, and/or training events of security staff.
  • the tracking system of the invention may be used to monitor an exercise of the fire brigade.
  • the area of interest is preferably selected so as to cover the entire training event and/or a relevant part of the event.
  • the sports event is selected from a competition, for example an equitation event, for example a horse race and/or a showjumping event, a team sport event, such as a football match, a basketball match, a handball match, a baseball match, an American football match, a rugby match, or a racket sports event, such as a tennis match or a badminton match, for example.
  • the entity 13 may be an object, for example a vehicle, or may be a living individual, for example a human or animal subject.
  • the moving entity 13 is a horse ridden by a horse rider.
  • the accessory object 18 may function as a reference object or point, as described elsewhere in this specification.
  • the tracking system 1 is adapted and/or configured to generate one or more physical, physiological and/or biometric parameter of the entity 13.
  • physiological and/or biometric parameters are assessed only with respect to living entities, in particular a human or animal having the respective parameter.
  • certain physical parameters can only be determined with respect to moving entities, such as speed, direction of movement, and trajectory, for example.
  • the tracking system of the invention is configured to determine a trajectory 16 of said moving object or individual in said geographical area of interest 12, for example. These latter parameters may be determined with respect to living entities as well as moving (nonliving) objects.
  • physiological parameters are respiration, heart rate, transpiration, and body temperature, for example.
  • the physiological parameter is a health parameter and/or is associated with the health or health status of an individual. For this reason, the physiological parameter can be used for establishing safety rules.
  • biometric parameters are parameters related to the surface of the entity, for example, in the case of human or animal entities, skin color, hair and hair color, and paint and/or other coloring.
  • the camera 2, 3 is preferably selected so as to be useful in the image analysis as foreseen in the system of the invention.
  • Figure 2 shows in greater detail the one or more data processing unit 5.
  • the one or more data processing unit may be in the form of one or more computers.
  • the term "data processing unit" includes the plural form.
  • the data processing unit 5 as shown in Figure 2 comprises a detection unit 6, which is capable of detecting the presence of a moving entity of interest 13 in the images captured by camera 2. It is noted that the expressions "capable of” and “is configured to”, in the context of the data processing unit, includes and/or refers to "is programmed to" and/or “contains software that is capable of" and/or “runs algorithms capable of”.
  • When detecting a moving entity of interest, the detection unit 6 is preferably capable of distinguishing moving entities that are not of interest from those of interest, for example the movement of a spectator or an umpire from the movement of a horse to be tracked.
  • the tracking unit 7 is activated.
  • the tracking unit 7 tracks the moving entity 13 on the successive images of the film. In so doing, the moving entity is also identified, as identification takes place along with tracking. Accordingly, part of the tracking unit 7 may comprise an identification unit 8.
  • the identification unit is configured to identify an entity 13. For example, the entity 13 is identified as a human operator or a particular vehicle and the like.
  • said one or more data processing units comprise a detection unit 6, adapted to detect said entity 13 within said image 4, a tracking unit 7 adapted to track the detected entity 13 on the successive images taken by the camera 2, 3, and an analyzing unit 9, adapted to determine a parameter related to said object or individual 13 tracked by tracking system.
  • the tracking system of the invention is configured to identify said entity 13 by methods and/or analyses selected from the group of: ballistic analysis (speed, movement pattern), form recognition, identification of a visual or magnetic tag 19, color analysis, and analysis of the surface structure.
  • the entity 13 may be recognized by a combination comprising two or more of said analyses and/or methods. The identification is preferably accomplished by said identification unit 8.
  • the cameras 2, 3 of the system of the invention are preferably selected in accordance with the algorithms used for image analysis, and/or so as to allow identification of the entity 13 or of compliance with safety rules while reducing the complexity of the algorithms.
  • the system of the invention comprises at least one camera selected from the group of: visible light cameras; cameras that are sensitive to infrared light and/or that capture infrared light, such as cameras selected from infrared (IR) cameras, near infrared (NIR) cameras and thermal cameras; time-of-flight cameras; short bandwidth cameras; UV cameras; and other cameras as specified elsewhere in this specification.
  • the invention encompasses the use of two or more different or identical cameras.
  • the tracking system comprises a camera that is capable of capturing ultraviolet (UV) light.
  • the identification unit 8 is capable, for example, of recognizing a tag 19 that may be provided on the moving entity 13 in order to facilitate identification of the moving entity.
  • the identification unit is also capable of identifying moving entities, for example persons, and of identifying objects, such as safety equipment, for example.
  • the analyzing unit 9 contains algorithms and computer programs and is capable of determining various parameters or situations, in particular physical, physiological and biometric parameters of the moving entity 13 tracked with the tracking unit 7, or the occurrence of a situation that is associated with a risk, such as a risk of an accident, for example.
  • Figure 2 also shows the displaying units 10 and 11, on which data calculated by the analyzing unit 9 is presented in an appropriate form to be understood by an observer. The invention takes into account that, depending on the observer (medical staff, umpire, spectator), different data may be helpful or necessary, which is why there are different display units 10, 11.
  • one display unit 10 may be used to display data in real time, whereas the other display unit 11 may be used to display recapitulative information, for example information covering a distance, a time period, and/or an entire performance of a moving entity.
  • the display unit 11 may be used to display average values (average speed, etc), or trajectories.
  • the display unit is used to produce a safety measure with respect to the occurrence of a risk or otherwise undesired situation.
  • the detection unit 6, the tracking unit 7, the identification unit 8 and/or the analyzing unit 9 may be separate physical entities, for example one or more computers or hardware components, or they may be in the form of software components.
  • the data processing unit 5 is one or more computers comprising software and/or hardware components functioning as one or more selected from the group of detection, tracking, identification and/or analyzing unit 6, 7, 8, 9.
  • the tracking system 1 of the embodiment shown in Figure 3 A comprises a plurality of cameras 2.1, 2.2, 2.3, which observe adjacent and slightly overlapping areas of interest 12.1, 12.2, 12.3.
  • the moving entity 13 can be tracked over a larger overall area, the larger area being the sum and/or combination of the separate areas 12.1, 12.2, 12.3.
  • the cameras 2.1, 2.2, 2.3 are synchronized so as to allow continuous tracking of the moving entity 13.
  • the tracking system of the invention comprises a plurality of cameras adapted to observe and/or produce images of a continuous geographical area formed by said adjacent and/or overlapping geographical areas 12.1, 12.2, 12.3.
  • the tracking system of the invention comprises a plurality of cameras, wherein said cameras are positioned so as to increase and/or optimize the observability of said object and/or individual 13.
  • the tracking system of the invention comprises a plurality of cameras, 2.1, 2.2, 2.3, so as to allow the detection of an object and/or individual 13 and/or of a tag 19 placed on said object and/or individual 13 even if said object, individual 13 and/or tag 19 cannot be detected from one of said cameras 2.1, for example due to an unfavorable orientation of said object, individual 13 and/or tag 19 with respect to said camera 2.1.
  • Figure 3 B shows a tracking system according to an embodiment of the invention, wherein two or more separate cameras 2.1, 2.2 are used to observe a defined geographic area of interest 12.1.
  • the plurality of cameras 2.1, 2.2 observe the area 12.1 from different positions and/or angles.
  • the availability of differently positioned cameras observing a defined geographic area 12.1 improves the level of observability, increasing the probability that at least one camera captures images that can be used for generating and/or retrieving data and parameters as defined in this specification.
  • Figure 3 B shows that adjacent and/or slightly overlapping geographic areas 12.1-12.3 are observed, each area by a pair of cameras, 2.1 and 2.2; 2.3 and 2.4, and 2.5 and 2.6, respectively.
  • Two cameras monitoring (at least part of) a given zone may be of the same type, for example two visible light cameras, or may be of two different types, for example: (1) a visible light and an infrared camera, (2) a visible light and a hyperspectral and/or multispectral camera, and (3) an infrared and a hyperspectral and/or multispectral camera.
  • the tracking system of the invention comprises at least two different cameras, a first camera 2 and a second camera 3, wherein said first and second cameras are sensitive to and/or capture light of different wavelengths and/or wavelength ranges.
  • the tracking system of the invention comprises at least two different cameras 2, 3, a first camera 2 and a second camera 3.
  • said first camera 2 is capable of capturing visible light
  • said second camera 3 is capable of capturing infrared light.
  • Figure 4 illustrates an embodiment of the tracking system during operation.
  • a reference zone or zone of analysis 14 is detected and tracked by the image processing and/or analyzing software.
  • image analysis is performed, for example in order to determine the heart rate.
  • Heart rate is generally expressed as the number of heart beats per time interval, for example heart beats per minute (bpm).
  • the speed and heart rate of the horse (in this case 25 km/h and 67 bpm) are displayed in real time.
  • Camera 2 may be a visual light camera and camera 3 may be an infrared camera, guided and/or controlled by the data processing unit to capture a close-up of image 30, in which the head of the horse is detected and tracked, so as to obtain an image containing more image data of the reference zone 14.
  • Figure 5 schematically illustrates the determination of the number of steps conducted by the moving entity 13, here a horse, between two obstacles 18.1 and 18.2, which may also function as fixed reference points in the geographic area of interest 12.
  • a zone of analysis or reference zone 14 is identified, tracked and analyzed. Suitable algorithms are applied in order to determine the number of steps/strides conducted by the moving entity, between the obstacles 18.1 and 18.2.
  • the distance between the obstacles 18.1 and 18.2 can serve as a reference distance, and when the moving entity 13 passes the distance, the speed of the entity 13 when passing the reference distance can be determined.
  • the number of steps used to cover the reference distance may be displayed.
  • the number of steps/strides may be determined by the analysis unit 9, for example from the analysis of the movement, which may determine the time interval of the movement and/or the contact points 32 with the ground surface, for example.
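The determination of ground contacts 32 described above can be sketched as follows in Python, assuming the analysis unit has already extracted a per-frame height of the tracked reference zone (for example a hoof) above the ground plane; the function name and tolerance are illustrative.

```python
def count_strides(heights, ground_level, tol=0.05):
    """Count ground-contact events of a tracked reference zone (e.g. a
    hoof) from its per-frame height above the ground plane.

    A new stride is counted each time the zone descends to within `tol`
    of ground_level after having been clear of it.
    """
    strides, on_ground = 0, False
    for h in heights:
        if h <= ground_level + tol:
            if not on_ground:  # new contact event
                strides += 1
                on_ground = True
        else:
            on_ground = False
    return strides
```

Dividing the reference distance between the obstacles 18.1 and 18.2 by the count then gives the stride length.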
  • Figure 6 schematically illustrates the tracking of a plurality of moving individuals within the geographic area observed by the tracking system. In the embodiment shown, the tracking system is used to monitor a horse race with several competitors 13.1, 13.2 and 13.3 running simultaneously. Parameters determined by the data processing unit 5 are displayed on screens 10 and 11. Screen 10 shows an overview image indicating the trajectory of the moving individuals in the area of interest 12 as they advance. In an embodiment, screen 10 is a first screen and is accessible to spectators and/or the public of a sports event, for example.
  • Screen 11 shows more detail, including physiological data, such as heart rate (“Heart”), respiratory rate (“Breath”), the body temperature (“Temp.”) and transpiration (“Skin”).
  • screen 11 is available to an umpire and/or to a medical and/or veterinary staff.
  • the data shown on screen 11 allows the appropriate person to assess the health status of the moving individual, for example the horse.
  • the parameters allow an assessment of the risk of an adverse event, such as exhaustion, accident, and/or injury, for example. If the risk of an adverse event exceeds a predetermined threshold value, the tracking system of the invention produces a safety measure, such as a visible and/or audible warning, for example.
  • the system comprises at least one safety rule, which allows it to determine if the health of an individual is in danger.
  • the warning may call on a participating individual to stop the race, or a visible warning may be displayed on a screen. In this way, the participating individual may be forced to interrupt the race and/or be withdrawn from the competition, or any other measure may be taken to avoid an accident and/or harm.
  • the trajectory of a moving entity may be determined from the positions of the moving entity on successive images taken by the camera.
  • said data processing unit 5 is adapted to determine a trajectory 16 of an entity 13 by determining and/or storing successive positions of said entity on the geographical area of interest 12.
  • an extrapolated trajectory may be obtained, indicating future positions of the moving entity.
  • the future positions of a moving entity in dependence of time within the area of interest may be determined and/or calculated.
  • the tracking system of the invention preferably comprises safety rules that take the extrapolated trajectory and/or future positions at particular points in time of a moving entity into account. For example, from extrapolated trajectories, direction of movement and/or speed of two or more moving entities, the system may determine if the two or more entities are in collision course with each other.
  • the system comprises a safety rule based on the presence of a collision course between two moving entities, between one moving entity and a stationary object, or between a moving entity and an entity that is capable of moving but is currently not moving, for example because it has stopped.
  • the system of the invention may also calculate the time remaining until a calculated or projected collision and may take the remaining time into account as part of the safety rule.
  • the system may produce a safety measure if an entity is predicted or projected to enter into collision within 1 minute or less, for example 30, 20, 15, 10 or 5 seconds or less, based on parameters such as the extrapolated trajectory, direction of movement and speed of one or more moving entities, for example.
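Purely by way of illustration (not part of the application text), such a collision-course safety rule may be sketched as follows, assuming linearly extrapolated trajectories with positions and velocities on the ground plane in metres; the 2 m safety radius and 30 s horizon are illustrative values only:

```python
import math


def time_to_closest_approach(p1, v1, p2, v2):
    """Time (s) at which two linearly extrapolated trajectories are
    closest; returns 0 if the entities are already diverging."""
    dp = (p2[0] - p1[0], p2[1] - p1[1])
    dv = (v2[0] - v1[0], v2[1] - v1[1])
    dv2 = dv[0] ** 2 + dv[1] ** 2
    if dv2 == 0.0:  # identical velocities: distance stays constant
        return 0.0
    t = -(dp[0] * dv[0] + dp[1] * dv[1]) / dv2
    return max(t, 0.0)


def collision_warning(p1, v1, p2, v2, safety_radius=2.0, horizon=30.0):
    """True if the extrapolated trajectories come within the safety
    radius of each other within the given time horizon."""
    t = time_to_closest_approach(p1, v1, p2, v2)
    if t > horizon:
        return False
    d = math.hypot(p1[0] + v1[0] * t - (p2[0] + v2[0] * t),
                   p1[1] + v1[1] * t - (p2[1] + v2[1] * t))
    return d < safety_radius
```

The same check applies to a moving entity and a stationary object by setting the latter's velocity to zero.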
  • detrimental exhaustion, which can even lead to death, is a widespread problem that the present invention alleviates.
  • Figure 7 illustrates the use of the tracking system of the invention in a professional environment, for example a warehouse, in which motorized vehicles 13.1, 13.2 circulate.
  • the geographic area of interest 12 covers at least part of the area in which the vehicles 13.1 and 13.2 - here forklift trucks - operate.
  • the invention envisages situations in which one or more pedestrians are present, in addition to vehicles. The combined occurrence of vehicles and persons advancing on foot in a given geographical area makes the use of the system of the invention even more necessary and advantageous.
  • a "professional environment” or an area covering a professional activity for the purpose of the present invention, preferably concerns environments where human operators execute manual work and/or operate moving objects, engines that have moving components, and/or vehicles.
  • Typical professional environments are building, construction and industry sites, warehouses, manufacturing sites, factories, mills, plants, sites of packaging of goods, sites where goods are transported and/or exchanged, sites where goods are filled and/or transported, and the like.
  • the data processing unit 5 is adapted to evaluate the data created by the image analysis unit to determine the risk of an accident. In case the tracking system detects a dangerous situation, a warning may be produced, which is preferably output, for example by way of an acoustic signal or on a screen 11. Data are preferably stored on a memory, for example in a database 15, which allows statistical or other analysis, for example in order to improve the working process and/or behaviour of the drivers.
  • the data processing system 5 may also measure sound signals, which may be used to determine the occurrence of malfunctioning devices or vehicles, impacts and/or collisions.
  • the data processing unit 5 is configured to detect, identify and track entities 13.1, 13.2 appearing in the area of interest 12 and determine relevant parameters such as position, orientation and speed, and compare the data produced with rules, in particular safety rules. The data processing unit 5 may then check if conditions of the rules are met and, if so, trigger the appropriate signal, warning or other action.
  • rules are, for example, the pedestrian-vehicle distance, the loaded vehicle (forklift) direction, the presence of objects in a pathway/direction of a vehicle, the distance between vehicles.
  • a warning signal
  • the working process is interrupted, for example.
  • the tracking system of the invention comprises one or more safety rules, preferably a set of several safety rules.
  • the safety rules are preferably part of the analyzing unit 9.
  • the tracking system is preferably programmed to contain said safety rules.
  • the safety rules are preferably associated with the physical, physiological and/or biometric parameters.
  • the safety rules determine threshold values, which, when met, surpassed and/or undercut, indicate the presence of an undesired situation, such as an increased risk of accident, for example.
  • the system of the invention is preferably configured to determine whether there is a breach of any safety rule, and, in case of breach, produce a safety measure.
  • Exemplary safety rules are values and parameters selected from the group of: the speed of a moving entity; the distance between two different entities 13.1, 13.2, if at least one of the entities is moving; the change of the distance between two entities; the driving sense of a moving entity 13 (forwards or backwards); the presence and volume of a charge on a charged moving entity; the presence or absence of safety equipment; and the like.
  • the size or volume of a charge on a vehicle may have an impact on the overview of the driver. If the load is too large, the tracking system may detect the violation of a safety rule and take a safety measure.
  • the safety rules are preferably monitored continuously and in real time by way of algorithms contained in the analyzing unit 9.
  • the tracking system is configured to detect or determine, by comparing said parameter with said safety rule, the occurrence of an undesired situation selected from one or more of the group of: (1) a deviation from safety rules, (2) an increased risk of accident, and, (3) potentially harmful events.
  • the tracking system of the invention is configured to detect a breach of a safety rule from one or more parameters selected from the group of: the position, the orientation, the direction and the speed of the entity 13.1, 13.2; the distance between two different entities 13.1, 13.2; the distance between a moving entity and a stationary entity or object; the direction of movement between a moving entity and a stationary object or entity; and combinations of the aforementioned.
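Purely by way of illustration (not part of the application text), the comparison of measured parameters against a set of safety rules may be sketched as a small rule table; the rule names and threshold values below are invented for this sketch:

```python
# Each safety rule pairs a parameter name with a predicate that
# returns True when the parameter complies with the rule.
# Names and thresholds are illustrative only.
SAFETY_RULES = {
    "speed_kmh":                lambda v: v <= 15.0,  # site speed limit
    "pedestrian_distance_m":    lambda v: v >= 3.0,   # min. separation
    "inter_vehicle_distance_m": lambda v: v >= 5.0,
}


def breached_rules(parameters):
    """Return the names of all safety rules breached by the
    parameters determined by the analyzing unit for one entity."""
    return [name for name, complies in SAFETY_RULES.items()
            if name in parameters and not complies(parameters[name])]
```

A non-empty result would then trigger a safety measure such as a visible or audible warning.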
  • said system preferably comprises safety rules for determining the presence of a risk for the health of the entity 13.
  • the entity of interest 13 may comprise a tag 19, as shown on the vehicles 13.1, 13.2 in Figure 7.
  • the tags 19 facilitate identification and tracking by the tracking unit 7 and identification unit 8 (Fig. 2). It is noted that depending on the angle of view of the moving object with respect to the camera, the tracking of a defined moving object of interest may be difficult, because of the change of the outline of the object in dependence of perspective. In these cases, the use of a tag allows rapid and reliable identification and/or tracking. In other embodiments, a tag is not necessary and may thus be absent, because the camera is capable of reliably identifying a particular object without the need of a tag.
  • the tracking system of the invention is in particular useful for avoiding accidents in stressful or hectic situations, in moments of increased workload or increased professional activity and/or in moments of increased pressure, such as time pressure, or pressure on the staff in general. It has been observed that in such situations, operators tend to neglect or forget safety rules, which is one reason why accidents in a professional environment occur more frequently in such situations.
  • Figures 8 A through 8 C and 9 A- 9 C illustrate another embodiment of the invention related to the reduction of risks of accidents occurring in a professional environment and/or improving safety during professional activity.
  • Figs 8 A-8 C are original image data (photographs) which may be analyzed by the tracking system 1 of the invention, whereas Figs 9 A-9C illustrate similar images by way of drawings. Most reference numerals are inserted in Figs. 9 A-C only.
  • the moving entity 13 (here: individuals 13.1, 13.2, 13.3 in Figs. 8 A-9C, respectively) is a human individual.
  • the area of interest 12 may be a professional workplace as shown in Fig. 7.
  • the camera of the tracking system may monitor an access to the working place, so that each human operator has to pass in front of a more restricted area covered by the camera.
  • the access may be a door or a corridor by which the operators 13 access to a workplace, for example.
  • the tracking system 1 is configured to identify the head 31 of the individual so as to determine the reference zone 14. The system then checks if safety equipment 21 is present in or close to the reference zone 14.
  • the system 1 may be configured to identify a more restricted, second reference zone 14.1 within a larger, first reference zone 14. For example, in a first step the head of the moving operator 13 (13.1-13.3) is identified for defining a first reference zone 14, and, in a subsequent step, the eyes of the operator are identified so as to define a second, smaller reference zone. The image analysis is then conducted within the second reference zone. In the embodiment shown, the system is configured to identify the zone of the eyes of the individual within the first reference zone 14, so as to determine the position of the second reference zone 14.1.
  • the safety equipment are safety glasses 21, which the tracking system seeks to identify within the reference zone, in particular at the position of the eyes of the operator.
  • the tracking system is configured to determine a parameter, said parameter allowing the detection of the occurrence of an undesired situation.
  • the system of the invention uses said parameter for detecting the occurrence of an undesired situation, such as the breach of a safety rule.
  • the undesired situation is selected from one or more of the group of: (1) a deviation from safety rules, (2) an increased risk of accident, and, (3) other potentially harmful events.
  • the tracking system detects any one of (1), (2) or (3) in a professional environment.
  • said parameter is related to the presence or absence of a safety equipment object 21 in said reference zone 14.
  • the safety equipment is selected from the group consisting of: safety glasses, a hardhat, a safety helmet, gloves, shoes, a life vest, high visibility clothing, and combinations comprising two or more of the aforementioned.
  • the safety equipment comprises a particular tag, or a material reflecting light, such as light of a particular wavelength, for example. The particular light reflected by the safety equipment may be detected by the camera 2, 3.
  • the tracking system of the invention may comprise a source of light.
  • the tracking system of the invention further comprises a light source 20 capable of emitting light, and wherein said data processing unit 5 is adapted to determine, from the light reflected from the surface of said object and/or individual 13, a parameter with respect to said individual 13.
  • a life vest or any other safety equipment may comprise a material or surface reflecting a particular light, for example IR or NIR light.
  • the camera of the tracking system is selected so as to capture the light of the particular wavelength as reflected by the material or surface, for example an IR, NIR or thermal camera in case of a material or surface reflecting IR or NIR light.
  • the light source is preferably selected so as to produce light that is reflected by the material or surface, such as IR or NIR light source in case of an IR or NIR reflecting surface.
  • the use of a light source may thus facilitate the identification of an entity and/or the presence of safety equipment.
  • the tracking system of the invention is configured to produce a safety measure if an undesired situation is detected. In this manner, the tracking system is used for assuring the compliance with and/or the implementation of safety rules, such as the wearing of safety equipment during work.
  • safety glasses are absent in the reference zone 14, and the tracking system of the invention will take a safety measure, such as the production of a warning.
  • the safety measure is the production of a visible or audible signal or message via said output unit 10, 11.
  • the safety measure may be the sending of a message to a telephone or a computer.
  • the system of the invention may be configured to send an email or SMS to the mobile phone or smart phone of an individual concerned by the breach of a safety rule.
  • the safety measure may be the sending of an alert message to an address, email address or phone number of an individual that conducted the breach of a safety rule, for example, and/or to an individual otherwise concerned by the breach of the safety rule, for example to the individual that suffers an increased risk or undesired situation due to the breach of a safety rule.
  • the output unit comprises an alarm notification appliance, such as a visible alarm signal entity, such as an alarm light, or an audible alarm entity, or a combination of both.
  • the output entity may comprise a screen indicating a warning, reminding the operator of the fact that he/she does not comply with safety rules at the working space.
  • the screen is preferably placed so as to be easily visible to the operator 13, for example placed at eye's height next to the entrance to the working area.
  • an audible warning may be produced. Audible warnings may include warning tones or a computer voice pronouncing the appropriate warning.
  • the warning may be made with reference to the name of the operator.
  • the computer voice may directly call the operator's name.
  • the safety measure directly interrupts the process that is associated with the breach of safety rule or the increased risk of accident or other potentially harmful event.
  • the tracking system may be used to prevent the harmful event directly.
  • the tracking system is preferably connected with a physical entity and acts on a physical device.
  • the tracking system may block access to the working space for the operator, for example, by blocking a door and the like, or may stop vehicles by remotely controlling such a vehicle.
  • the output entity comprises suitable equipment to send a signal that is received by the vehicle and that stops the vehicle actively.
  • the tracking system may also inform another central processing unit of the increased risk of an accident, said other data processing unit being able to control vehicles, moving objects or other machines having moving components.
  • the invention encompasses that the tracking system produces more than one safety measure, for example a combination of two or more safety measures as set out in this specification or other safety measures.
  • Safety measures, for the purpose of this specification, encompass any measure that has the purpose of undoing or reducing the occurrence of an undesired situation, of an increased risk of accident, or of other harmful events, or which directly provides rapid assistance or help in case of the occurrence of an undesired event or situation, such as an accident.
  • Figure 8 B illustrates the situation where the tracking system of the invention identifies the safety equipment and does not produce any warning, because the operator 13 complies with the safety rules.
  • the tracking system may also produce a positive message in such a case, such as a smiley displayed on a screen or an encouraging audible message.
  • the algorithms of the tracking system of the invention are capable of distinguishing usual gear from safety equipment.
  • the tracking system is configured to recognize the appearance of normal glasses 24 (for example, for correcting vision) in the area of interest 14 and to determine that the normal glasses are different from the safety glasses. Therefore, a warning or safety measure is preferably also produced in the case of Figs. 8 C and 9 C.
  • the safety equipment is a helmet 21, 21.1.
  • Fig. 10 is an image 4 taken by an RGB camera at the entrance to a working space.
  • Fig. 11 reproduces the image 4 of Fig. 10 as a drawing for better illustration with reference numbers.
  • the tracking system of the invention is configured to identify moving human operators 13.1 and 13.2, and to identify the respective area of interest 14.1 and 14.2 associated with each human operator, each operator being a "moving entity 13" in accordance with the invention.
  • the tracking system is configured to analyze the area of interest 14.1, 14.2 (here: the head of the operator) and to determine the presence or absence of a safety equipment 21 (here: a hardhat) in the areas of interest 14.1, 14.2.
  • a warning or safety measure is produced, for example a visible or audible warning as exemplified elsewhere in this specification.
  • a hardhat 21 is detected within the respective reference zone 14.2, and no warning is produced with respect to operator 13.2.
  • the analyzing unit may also detect the presence of hardhat 21.1, which is carried in the hand by operator 13.1.
  • since hardhat 21.1 is present outside the reference zone and thus not worn, a safety measure is still produced.
  • hardhat 21 carries a tag 19 for facilitated recognition and hardhat 21.1 does not carry such a tag. The invention can be performed with or without the use of such a tag.
  • the algorithm of image analysis has to be adjusted with respect to whether or not a tag is used.
  • the use of a tag may facilitate the algorithms, because the same recognition pattern (tag) can be used with respect to different situations.
  • the safety equipment 21 comprises a tag 19, which allows the analyzing unit of the tracking system to identify the presence of the safety equipment rapidly.
  • the tag 19 may facilitate the detection of the safety equipment in the reference zone 14.
  • the analyzing unit is configured to identify a particular safety equipment 21 (helmet, safety glass, etc), even without the need of a tag. Whether or not a tag is used may depend on factors such as the area covered by a single camera, the quality of the camera and the quality of the algorithms used by the tracking system and in particular the identification unit 8 of the system (Fig. 2).
  • a visible light sensitive camera is used for monitoring a relatively restricted area covering the access to a working space. Thanks to the relatively small area 12 (compare Fig. ) to be covered, the use of a tag on the hardhats 21 is optional, as the system may well recognize a helmet even without the use of a tag. It is further noted that some safety equipment, such as glasses, may provide little space for applying a tag, which is why in the embodiments shown in Figs 8 A through 9 C no tag was used for detecting the presence or absence of safety glasses.
  • the system of the invention observes the occurrence of a deviation from safety rules.
  • the tracking system uses the one or more parameters for detecting the breach of a safety rule.
  • safety rules may be the requirement for human operators to wear safety equipment or rules with respect to the handling and/or operation of vehicles and machines.
  • safety rules may comprise speed limitations, required distances between vehicles or between a vehicle and a stationary object or an operator.
  • the system of the invention may detect the presence of an increased risk of an accident from the direction of movement (trajectory or extrapolated trajectory) of a moving entity, the speed of the entity and from the presence and/or behavior of other entities, such as human operators or stationary objects.
  • Safety measures are preferably taken as soon as a breach of a safety rule and/or an increased risk of accident is detected.
  • the tracking system of the present invention is operational on the basis of image-related data only, although the combination with sensors and the like may result in more exact and/or more reliable parameters and is also envisaged in some embodiments of the invention.
  • At least one camera that is used for the purpose of the present invention is pre-calibrated and/or installed before any moving entity is tracked. For example, once the camera is installed to observe a particular area of interest 12, deflection due to the lenses of the camera is determined and compensated by the data processing unit. Preferably, an algorithm is applied, which produces an image in which deflection has been corrected. In a second step, a grid pattern-based positioning system is applied to the image created by the camera, so that any given pixel/position in the image can be attributed to a position on the ground as shown in the image. This may be performed using LED signals positioned at known distances and angles, for example in the corners of a square provided on the ground in the observed area (Figure 15).
  • the ground visible in image 4 provided by the camera can now be represented as a coordinate system 17.
  • Each pixel or position in an image captured by the camera can now be assigned to a position, which can be expressed in terms of values of the two axes of the coordinate system 17.
  • any pixel of the image that falls in the coordinate system corresponds to a position.
  • Image pixels that are in the region where there is no ground surface do not represent any position. This is the case, for example, in the top part of the image of Figure 14, where a wall is seen.
  • said data processing unit 5 is adapted to associate a position on an image made by said one or more camera with a position on the ground of said geographical area 12. In an embodiment, said data processing unit 5 is adapted to determine a position on the ground of said geographical area 12 from a position of said entity in said image (4), and/or from the position of said reference zone 14.
  • the geographical area of interest 12 is a substantially flat and/or even surface. If the area 12 comprises uneven parts, the data processing unit 5 may ignore them and consider the area to be even.
  • said data processing unit 5 of the tracking system of the invention is adapted to calibrate an image 4 of said one or more cameras by generating a coordinate system 17, wherein a position A in said coordinate system 17 represents a position on said geographical area of interest 12, wherein said data processing unit 5 is configured to associate said coordinate system 17 with said images 4 generated by said camera, and/or wherein any pixel of said image taken by said camera can be attributed to a position of said geographical area 12 on the basis of said coordinate system 17.
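Purely by way of illustration (not part of the application text), one standard way to realize the pixel-to-ground association described above is a planar homography estimated from four known correspondences, for example the four corner light sources of a calibration device such as device 33, under the stated assumption of a substantially flat ground; the direct linear transform (DLT) sketch below is a choice of this illustration:

```python
import numpy as np


def homography_from_points(pixels, ground):
    """Estimate the 3x3 homography mapping image pixels to ground
    coordinates from four point correspondences (DLT method)."""
    A = []
    for (u, v), (x, y) in zip(pixels, ground):
        A.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        A.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    # The homography is the null vector of A, found via SVD.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)


def pixel_to_ground(H, u, v):
    """Project one image pixel onto the (assumed flat) ground plane."""
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w
```

Once H is estimated, every pixel showing the ground surface can be attributed to a position in the coordinate system.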
  • said image analysis unit 9 is adapted to determine the position where an entity 13 is in contact with the ground in said geographical area 12.
  • Figure 15 shows a device 33 that can be used for rapid geometric correction and/or calibration of a camera used in the system of an embodiment of the invention.
  • the device comprises flexible bands or cords 35 of a determined length.
  • Light sources 34 are provided in the four corners of the determined geometric shape of the spread and/or unfolded device, here a rectangle. In the device shown, there is also a light source in the center of the device, where the diagonals of the rectangle cross.
  • Light sources 34 or assemblies of light sources of one or more wavelengths may be used, so that the camera to be calibrated detects the light source.
  • each light source is an assembly comprising two light sources, one emitting visible light (small empty circle in light source 34 in Fig.
  • the device 33 is preferably useful for calibrating cameras that are sensitive to different light wavelengths.
  • the device 33 is spread at different positions in the geographical area of interest 12, and the data processing unit 5 geometrically calibrates the camera using the signals produced by the light sources 34.
  • the data processing unit uses the known distances between the light sources to calibrate the camera.
  • Figure 14 is an image 4 taken from a geographical area of interest 12, here the scene of a show-jumping course. The camera has been calibrated so that a coordinate system or grid 17 can be overlaid on the image.
  • the camera is preferably immobilized, positioned in a fixed manner, or the specific position and adjustment of the camera is determined, so that if an image is taken with this particular view, any pixel can be associated with a position on the ground. If the camera is motorized to be directed to another area of interest, the position for which it has been calibrated is preferably stored, so that the same position can be adjusted automatically at a later point in time, for example.
  • the data processing unit 5 of the tracking system 1 of an embodiment of the invention has the information required for determining the position of a detected entity in the geographical area of interest 12.
  • the presence of a moving object of interest 13 is detected by the detection unit 6.
  • the analyzing unit 9 can determine the position of the tracked object. If the camera is positioned so as to yield a perspective view, the lowest point of a detected entity 13 on an image 4 may be used as the position of the entity in the area 12. In a perspective or front elevation view, the lowest point generally is the point of contact of the entity 13, for example a moving individual, with the ground, which is why the position of this point can be identified using the coordinate system 17, for example.
  • the image analysis unit 9 preferably determines a center of the detected entity and uses the position of this center in the coordinate system to determine the position of the entity in the geographical area.
  • the data processing system 5 and in particular the image analysis unit 9 can determine several parameters from the position of the entity at a given point in time.
  • the trajectory and the speed may be determined. From the trajectory, it is possible to determine direction of movement and possibly the orientation of the moving entity.
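Purely by way of illustration (not part of the application text), deriving speed and direction of movement from successive ground positions may be sketched as follows, assuming positions in metres already obtained via the coordinate system 17 and a known camera frame rate; the function name and units are choices of this sketch:

```python
import math


def trajectory_parameters(positions, fps):
    """Derive speed (m/s) and heading (degrees, 0 = along the x-axis)
    from the last two ground positions of a tracked entity."""
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    dt = 1.0 / fps                      # time between consecutive frames
    dx, dy = x1 - x0, y1 - y0
    speed = math.hypot(dx, dy) / dt
    heading = math.degrees(math.atan2(dy, dx)) % 360.0
    return speed, heading
```

Orientation of the moving entity can then be approximated by the heading, and the stored sequence of positions constitutes the trajectory 16.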
  • the invention encompasses skeletal tracing of a living mammalian individual moving in the area of interest 12.
  • the system of the invention may use, for example, a Kinect hardware component and/or commercially available SDKs (software development kits) for skeletal tracking.
  • Skeletal tracing may be used, for example, to determine the orientation of a living individual, and/or for determining the number of steps/strides performed by an individual, for example within a given distance or course.
  • the invention encompasses determining the respiratory rate of a living individual in the geographic area of interest. Respiratory rate may be determined using one or more infrared and/or thermographic cameras, for example, in particular if the ambient temperature is sufficiently different from the body temperature of an individual.
  • the system 1 of the invention comprises at least one infrared camera 3, for example in addition to a visible light camera 2 (Figure 1).
  • a similar processing unit as disclosed in Figure 2 may be used.
  • a tracking unit and/or identification unit may identify the head 31 of a moving individual and track the head and/or the area in front of the head, in the direction of the orientation of the head ( Figures 12 A and 12 B).
  • the zone of interest 14 that the image analysis unit preferably analyzes is in the center of the left half of the image.
  • the cycles of exhalation ( Figure 12 A) and inhalation ( Figure 12 B) can be derived using a suitable algorithm, and the respiratory rate can be determined. It is noted that the images in Figure 12 A and 12 B were treated so as to show in black the pixels that exceed a threshold brightness value. In this manner, the identification of exhalation is facilitated.
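Purely by way of illustration (not part of the application text), one suitable algorithm for deriving the respiratory rate from the thresholded thermal images may be sketched as follows, assuming the analysis unit provides, per frame, the count of pixels exceeding the brightness threshold in the zone in front of the nostrils; threshold and function name are choices of this sketch:

```python
def respiratory_rate_bpm(exhale_pixel_counts, fps, threshold):
    """Estimate breaths per minute from the per-frame count of
    above-threshold (warm) pixels in the exhalation zone: each
    rising edge through the count threshold marks one exhalation."""
    breaths = 0
    above = exhale_pixel_counts[0] > threshold
    for count in exhale_pixel_counts[1:]:
        now = count > threshold
        if now and not above:       # start of a new exhalation cycle
            breaths += 1
        above = now
    duration_min = len(exhale_pixel_counts) / fps / 60.0
    return breaths / duration_min
```

In practice a smoothing step would precede the edge detection to suppress noise, and the result can be shown on display 11 together with other physiological data.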
  • if a thermal camera 3 is used for determining respiratory rate, the camera can be provided at a fixed position, as described above, for example.
  • the camera is preferably equipped with the required resolution and/or detection capacity, containing a large matrix of sensitive pixels, allowing the reliable analysis of a comparatively small area of the image taken by the camera.
  • a thermal camera 3 with a comparatively lower resolution and/or detection capacity may be used.
  • a motorized camera may be used, so that it can be directed to a particular zone of interest, where there is a moving entity 13.
  • the data processing unit 5 may drive the motors of the infrared camera 3 to keep it directed towards the moving entity 13.
  • the data processing unit 5 may use the position of the moving entity as determined with the aid of the visible light camera 2 as described elsewhere in this specification.
  • the infrared camera 3 may be connected to a data processing unit containing its own detection, tracking and analyzing units, so that the infrared camera is directed towards the moving entity on the basis of images obtained from the infrared camera itself.
  • the tracking system of the invention comprises at least one camera that is sensitive to infrared light and/or that captures infrared light.
  • the tracking system comprises a camera selected from infrared (IR) cameras, near infrared (NIR) cameras, and thermal cameras.
  • the two cameras may be directed to capture images from one or more selected from the group consisting of: (1) substantially the same geographical area of interest 12; (2) from at least partially different geographical areas of interest 12.1, 12.2, etc; (3) from a geographical area of interest 12 and from a (smaller) area within the geographical area of interest; (4) from a geographical area of interest and from a view in the geographical area of interest, wherein the view may capture an image from an area that is partly or totally outside the area of interest, but which may capture an entity staying in the area of interest; and (5) from an area of interest 12 and an area that is not of interest, wherein the latter may be used for calibration purposes and/or for generating reference and/or comparative data, for example, reference data of one or more of the parameters described in this specification.
  • Figure 16 is an image taken by an infrared camera. Due to the high contrast between the living subject and the background obtained with this camera, the use of the infrared camera is particularly useful to rapidly identify a moving object within an area of interest.
  • the tracking system of the invention may combine the use of an infrared camera with an RGB camera, wherein the infrared camera is generally used to rapidly identify the moving entity, and the RGB camera is used in addition for image analysis, for example for retrieving information from the reference zone.
  • both cameras, that is the infrared camera and the other camera used in combination with it, are synchronized.
  • the invention encompasses displaying the respiratory rate on a display 11 (Figure 4), optionally together with other physiological data, as illustrated in Figure 6.
  • the invention provides the determination of a heart rate of a tracked individual.
  • Figures 13 A and 13 B illustrate the determination of the heart rate of an individual using a hyperspectral and/or multispectral camera, for example.
  • Figure 13 A is produced using a hyperspectral camera directed to a position on a body, which here is the common carotid artery.
  • One image is produced as a line across the zone of interest, and successively produced image-lines are arranged one below the other, so as to yield Figure 13 A.
  • Figure 13 A shows the light intensities at specific wavelengths as determined by the hyperspectral camera. Fluctuations in intensity corresponding to the heart rate can be seen with some of the wavelengths.
  • the white rectangle in Figure 13 A represents the zone of interest 14 that will be analyzed by the analyzing unit 9, for example.
  • the data processing unit 5 can determine the heart rate of the individual. In case of a moving object, care has to be taken to keep the camera directed to the zone of interest. As discussed above with respect to the determination of respiratory rate, it is envisaged to use a motorized camera, which is guided to remain oriented to the zone to be analyzed of the moving individual.
  • the present invention envisages determining heart rate at a predetermined position, for example within the geographical area of interest 12.
  • the camera may be positioned to observe an area where the moving individual, for example the horse, is required and/or expected to slow down.
  • heart rate may be determined at specific moments and/or when the individual passes specific, selected posts, and not necessarily over the entire course and/or trajectory.
  • if camera 3 in Figures 1-3 is a hyperspectral camera, for example, it may not observe the same geographical area as the visible camera 2.
  • the second camera 3 may be positioned or motorized so as to be directed towards a selected extract within the geographical zone of interest 12, for example.
  • the tracking system of the invention comprises a plurality of cameras, 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, wherein one or more of said cameras are adapted to capture images from a geographical area of interest 12.1, or - from within the area 12.1 - from an extract or smaller area, and/or from different positions and/or angles.
  • the invention further encompasses determining transpiration of an individual using, for example, camera and image analysis tools.
  • a camera may be positioned to cover the same geographical area 12 as the visible camera 2, or another zone, or an extract within the area 12.
  • the camera used for determining transpiration may be a visible or hyperspectral camera, for example.
  • the tracking system comprises an IR, NIR or thermal camera, and wherein said analyzing unit 9 is adapted to determine, from an image of said individual obtained with said IR, NIR or thermal camera, a status of transpiration of said individual.
  • transpiration of an individual may be determined from the extent of light reflection from the body of the individual.
  • a light source (not shown in the figures) may be provided to facilitate detection of a parameter, such as a physiological parameter.
  • a light source is used that accentuates reflection in presence of transpiration.
  • a light source emitting polarized light may be used.
  • the camera capturing the reflected light may be equipped with polarizing filters, so that light that does not originate from reflection of the polarized light can be removed for the purpose of analysis, for example.
  • a light source favoring reflection may be used.
  • the invention encompasses tracking transpiration continuously, at specific time intervals and/or at specific positions, for example posts of a course where an individual is expected to pass.
  • the tracking system 1 of the invention comprises one or more light sources.
  • the invention encompasses the combination of image analysis technology, for example as described herein, with other tools of analysis, so as to render the determined parameters more reliable or more precise, or to obtain redundant data that can be used for checking the correct operation of the tracking system, for example.
  • Other analysis may include other non-invasive analysis, such as sonar measurements, or the use of sensors placed on the moving entity.
  • the tracking system is adapted to generate redundant data with respect to a parameter of said entity 13, for example using the microsystem as specified elsewhere in this specification.
  • the invention provides the measuring of sound and the synchronization of the sound measurements with the visual measurements and/or visually determined parameters.
  • the noise emission produced by the steps of a horse may be used to determine the number of steps during a time period or within a trajectory.
  • the steps determined with sound measurements may be synchronized with the steps as determined from tracking with cameras and the data processing unit 5.
  • a network based on wireless sensors is preferably used in combination with the contactless measurements.
  • a "microsystem" is an assembly of electronic circuits assuming the function of a sensor and having access to wireless transmission, allowing real-time use of the transmitted information.
  • the microsystem comprises an electronic circuit comprising a sensor 26 and a transmission system and/or a transmitter 27, for transmitting data produced by said sensor.
  • the tracking system comprises one or more sensors 26 adapted to be carried on or by said entity 13, wherein said sensor is selected from any one or more of the group consisting of: heart rate sensors, galvanic skin resistance (GSR) sensors, inertial sensors, such as gyroscopes and/or accelerometers, position and orientation sensors, such as magnetometers and GPS sensors, sound sensors and microphones, distance meters, pressure transducers, transponders and/or transmitters allowing direction finding (DF) and/or triangulation, temperature sensors.
  • the tracking system of the invention comprises a receiver unit 28, adapted to receive data transmitted by a transmitter 27 and/or microsystem 25 placed on said entity.
  • the tracking system of the invention is adapted to determine a given parameter of said entity 13 from image-related data of said one or more cameras 2, 3 and from data generated by a sensor 26 carried by said entity 13.
  • the tracking system is configured to determine simultaneously or nearly simultaneously, and independently, a given parameter from said image-related data and from said sensor-generated data, respectively. In this manner, the tracking system obtains the same parameter or comparable parameters from independent sources.
  • the data may be considered redundant data, and may be used as a control.
  • the system may comprise algorithms for transforming redundant data so as to render comparable the values related to a given parameter but determined in different ways.
  • the system of the invention may comprise a safety rule which depends on the similarity of the redundant data related to a given parameter.
  • the breach of a safety rule may be found if the data obtained from different sources (for example: image related data and sensor-related data) but related to a given parameter diverges beyond a particular threshold value.
  • a "manual" calibration by placing a device with light sources as shown in Figure 15 on different areas of the geographical area of interest could be omitted.
  • a test system allowing intermediate values of the algorithm to be consulted in real time may be combined with the tracking system, for example: (1) with a single camera for simple cases, (2) with several cameras having complementary fields of vision in order to increase the surface of the observed zone, (3) with a stereo camera or a plurality of cameras observing the same zone from different viewpoints in order to monitor the actors in a complex environment.
  • Two or more types of cameras may be combined for tracking using properties of the subject, such as heat release or reflection of UV light in addition to information available from the visible light spectrum.
  • the invention encompasses the combination of tracking and measuring physical, physiological and biometric parameters or characteristics. In an embodiment, the invention encompasses combining tracking with sensors or tags provided directly on the respective entity 13 or on an object used by the entity, for example a golf club or polo mallet.
  • the tracking system of the invention comprises a hyperspectral and/or multispectral camera.
  • the invention encompasses combining tracking with a hyperspectral or multispectral camera in order to retrieve physical, physiological and biometric parameters without exerting any impact or effect on the tracked subject 13, for example.
  • said one or more data processing unit 5, in particular said image analysis unit 9, is capable of and/or configured to detect, on a reference zone 14 on the skin of said individual, a change of light intensity.
  • said data processing unit 5, and in particular said image analysis unit 9, is adapted and/or configured to detect, from said light intensity, the presence of water and/or a transpiration of said individual.
  • the invention encompasses combining a light source suitable to elicit biometric characteristics from the surface of the subject 13 (hair, skin, pelage (of an animal), paint). In an embodiment, the invention encompasses the exploitation of tags placed on the moving and/or tracked entity 13 for facilitating classification, ballistic measures and/or the identification of the entity 13.
  • the invention encompasses the exploitation of tracking information for performing automated comparisons of trajectories and/or strategies, for example in order to understand and resolve logistic problems, or, during a sports event, tracking of a horse during a horse race.
  • the invention encompasses measuring, in an automated manner, the position, orientation, direction of movement, speed and stride, for example of a horse during a horse race competition.
  • the invention makes use of tracking information for determining the risk of an accident and offering statistics on near-accidents or almost-accidents of an actor, with or without access to the corresponding images.
  • the tracking system is combined with sensors associated with a tool for loading software or firmware content into one or another component of the system, allowing in this manner optimal functioning even in case of initially unforeseen events.
  • safety equipment, such as safety glasses, a hardhat or a safety helmet, may be detected within or outside the reference zone.
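The redundancy check outlined in the items above — comparing a parameter determined independently from image-related data and from an on-body sensor, and finding a breach of a safety rule when the two values diverge beyond a threshold — can be sketched as follows. The function name and the threshold value are illustrative assumptions, not taken from the claims:

```python
def redundancy_breach(image_value: float, sensor_value: float,
                      rel_threshold: float = 0.15) -> bool:
    """Return True if two independent estimates of the same parameter
    (e.g. heart rate from a camera and from a worn sensor) diverge
    beyond a relative threshold, indicating a breach of the safety rule."""
    reference = max(abs(image_value), abs(sensor_value), 1e-9)
    return abs(image_value - sensor_value) / reference > rel_threshold

# Example: 62 bpm (camera) vs. 65 bpm (sensor) agree within 15 %.
assert redundancy_breach(62.0, 65.0) is False
assert redundancy_breach(62.0, 90.0) is True
```

A relative rather than absolute threshold keeps one rule usable across parameters with different scales, such as heart rate and respiratory rate.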

Abstract

The present invention concerns a tracking system comprising one or more cameras adapted to take images, one or more data processing units, said tracking system further comprising one or more output or display units, wherein said camera is adapted to observe a geographical area of interest, wherein said one or more data processing unit is configured to detect a moving entity in the area of interest, to identify a reference zone in at least some of said images, wherein said reference zone is on or associated with said moving entity, and to analyze the reference zone in order to determine at least one parameter associated with said moving entity.

Description

Camera-Based Tracking System for the Determination of Physical, Physiological and/or Biometric Data and/or for Risk Assessment
Technical Field
The present invention generally relates to image generation and analysis technology. In addition, the invention relates to the generation of data or parameters related to tracking, and determination of physical, physiological and/or biometric properties using cameras and data processing units. More specifically, the invention concerns a tracking system, uses of the tracking system and methods as defined.
Prior Art and the Problem Underlying the Invention
In the prior art, several techniques for tracking a moving object or individual are proposed. US 2013/0188031 discloses a risk recognition system based on human identification. A person is photographed and identified on the basis of photograph data available with respect to the person in a database. By tracking the movement path of the identified human, the system is capable of identifying the possibility of a dangerous situation. US 2008/0129825 discloses an autonomous picture production system, comprising an object tracing device, a motorized camera and a camera control device. Each traced object carries a location unit transmitting a signal allowing the determination of the position of the object. This system may be used to produce images of an object or a person during a sports event, for example, in an automated manner.
US 2011/0208444 discloses a system for measuring balance and track motion in mammals. The system comprises a band configured for attachment to a body part of a mammal for sensing, for example, the circumferential pressure at the body part where the band is attached.
It is an objective of the present invention to track a moving entity without equipping the latter with a sender. It is an object to track a moving entity non-invasively, preferably without putting any additional constraint on the moving entity. It is an objective of the invention to determine parameters concerning the moving entity on the basis of camera-acquired images, using image analysis methods.
It is an objective of the invention to produce information regarding the status, position, speed, orientation, distance, direction of movement, the trajectory of a moving entity, and/or the period of time spent during a movement or trajectory, preferably using one or more cameras and image analysis methods.
In case of a living individual, such as a human or animal, it is a further objective of the invention to obtain information regarding the number of steps, for example between two reference points, the size of steps, biometric data and/or data concerning the status, for example physiological status of the individual, such as the respiratory rate, the heart rate, transpiration from a body, body temperature, and constraints, forces, tension and/or extension experienced by the body. It is an objective to determine these parameters preferably without physically contacting the moving entity, for example without equipping the moving entity with a sensor, sender, or other data generating and/or transmitting unit having additional weight and requiring attachment to the moving entity. Attachment of a data generating unit on a moving entity may in some way influence the moving entity, which the present invention aims to prevent. It is an objective of the invention to produce relevant data and parameters as specified from image-related data.
A further objective of the invention is to assess the risk of accident or injury of a moving entity, for example in traffic or during a sports event. It is in particular an objective to provide the possibility of producing a warning or intervening in case there is a risk of accident and/or injury, or any type of damage in general. For example, in case of equestrian sports, it is an objective to monitor the physiological status of a horse in order to determine if there is a risk of health damage, injury, or accident, for example. Furthermore, it is an objective of the invention to provide data related to participants of a sports event. For example, it is an objective to display data related to the data and parameters specified above, for example to render the sports event more interesting or to have further insight. It is an objective to provide data that can be displayed and commented in real time, while a sports event takes place, for example, to spectators or to medical personnel. It is also an objective to produce data that can be used for generating statistics, to compare performances, trajectories, movements and other parameters related to a competitive activity. It is also an objective to generate data that is helpful in assisting in analysis and improvement of a moving entity's performance, for example by improving trajectories and movements, for identifying weaknesses, and the like. It is also an objective to generate data that can be used for statistics, for example for determining the number of situations that can be categorized as dangerous, such as "almost-accidents".
It is further an objective to produce data related to a parameter as specified above, wherein such data can be compared with data concerning the same parameter but determined in another way, so as to generate redundant data. More generally, it is an objective to provide ways for producing data that can, if desired, be combined with data generated in an invasive manner, for example by using a sensor and/or transmitter placed on the moving entity.
Summary of the Invention
In an aspect, the present invention provides a tracking system comprising one or more cameras adapted to take images, one or more data processing units, said tracking system further comprising one or more output or display units, wherein said one or more camera is adapted and/or positioned to observe a geographical area of interest, and wherein said one or more data processing unit is arranged to detect an entity of interest in the area of interest. The entity of interest may be one or more selected from any object of interest, a person, and an animal. The entity may be moving. In an aspect, the tracking system is configured to identify a reference zone on at least some of said images, wherein said reference zone is on or associated with said entity in said image.
In an aspect, at least one camera of the tracking system is an infrared camera and the data processing unit is configured to identify in the images an individual, such as a person or an animal, and to determine from the images taken by the camera the individual's respiratory rate.
In an aspect, at least one camera of the tracking system is a hyperspectral and/or multispectral camera and the data processing unit is configured to identify in the images an individual, such as a person or an animal, and to determine from the images taken by the camera the individual's heartbeat rate. In an aspect, the data processing unit of the tracking system is configured to determine, from images taken from an individual, the individual's transpiration.
In an aspect, the invention provides the use of an infrared camera for determining the respiratory rate of an individual. In an aspect, the invention provides the use of a succession of images taken by an infrared camera for determining the respiratory rate of an individual.
In an aspect, the invention provides a method for determining the respiratory rate of an individual, the method comprising the step of determining said rate by analysing a succession of images generated by an infrared camera.
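The method above — deriving the respiratory rate from a succession of infrared images, as also illustrated by the exhalation visible in Figure 12 A and absent in Figure 12 B — can be sketched as follows. The signal model (one mean thermal intensity value per frame of a region near the nostrils, with each intensity peak marking one exhalation) is an illustrative assumption, not a definitive implementation:

```python
import numpy as np

def respiratory_rate_bpm(frame_means: np.ndarray, fps: float) -> float:
    """Estimate breaths per minute from the mean thermal intensity of a
    region near the nostrils, sampled once per frame: exhalation warms
    the region, so each intensity cycle corresponds to one breath."""
    signal = frame_means - frame_means.mean()
    # Count upward zero crossings, i.e. starts of warming (exhalation) phases.
    crossings = np.sum((signal[:-1] < 0) & (signal[1:] >= 0))
    duration_min = len(frame_means) / fps / 60.0
    return float(crossings / duration_min)

# Synthetic signal: 0.25 Hz breathing (15 breaths/min) sampled at 10 fps.
t = np.arange(600) / 10.0
rate = respiratory_rate_bpm(np.sin(2 * np.pi * 0.25 * t), fps=10.0)
```

On this one-minute synthetic signal the estimate comes out close to the 15 breaths per minute of the generating sine; in practice the per-frame values would be smoothed before counting cycles.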
In an aspect, the invention provides the use of a hyperspectral and/or multispectral camera for determining the heart rate of an individual. In an aspect, the invention provides the use of a succession of images produced by a hyperspectral and/or multispectral camera for determining the heart rate of an individual.
In an aspect, the invention provides a method for determining the heart rate and/or pulse of an individual, the method comprising the step of determining said rate by analysing a succession of images generated by a hyperspectral and/or multispectral camera. In an embodiment, the tracking system of the invention comprises a hyperspectral and/or multispectral camera 3, wherein said one or more data processing unit 5, and in particular said image analysis unit 9, is capable of detecting, on a reference zone 14 on the skin of said individual, a change of light, and in particular light intensity within a given wavelength or wavelength range, and wherein said data processing unit 5 and/or said image analysis unit 9 is adapted to determine, from said light intensity, the event of a pulse beat and/or heart rate of said individual.
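The periodic intensity fluctuations described above (compare Figures 13 A and 13 B) can be converted into a heart rate by locating the dominant frequency of the intensity signal within a plausible cardiac band. The following sketch is illustrative; the band limits and the single-wavelength signal model are assumptions:

```python
import numpy as np

def heart_rate_bpm(intensity: np.ndarray, fps: float) -> float:
    """Estimate the pulse from periodic intensity fluctuations at one
    wavelength of a hyperspectral line image: the strongest spectral
    component between 0.7 and 4.0 Hz (42-240 bpm) is taken as the pulse."""
    signal = intensity - intensity.mean()
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return float(60.0 * freqs[band][np.argmax(spectrum[band])])

# Synthetic pulse: 1.2 Hz (72 bpm) plus noise, sampled at 30 fps for 30 s.
t = np.arange(900) / 30.0
rng = np.random.default_rng(0)
pulse = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.standard_normal(t.size)
bpm = heart_rate_bpm(pulse, fps=30.0)
```

Restricting the search to a cardiac band discards slower components such as respiration and motion, which would otherwise dominate the spectrum.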
In an aspect, the invention provides the use of one or more camera and/or one or more images produced by the camera for determining a parameter related to transpiration of an individual.
In an aspect, the invention provides the use of one or more images for determining a parameter related to transpiration of an individual, the method comprising the step of determining transpiration from the reflection of light from the skin of the individual.
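Determining transpiration from light reflection, as described above, can be approximated by measuring the fraction of near-saturated specular highlights within the skin region of a grayscale image. The highlight level and decision threshold below are illustrative assumptions:

```python
import numpy as np

def specular_fraction(gray: np.ndarray, highlight_level: int = 230) -> float:
    """Fraction of skin pixels that are near-saturated highlights; a wet
    (transpiring) surface reflects more light specularly, so a rising
    fraction over time can indicate transpiration."""
    return float(np.mean(gray >= highlight_level))

def transpiring(gray: np.ndarray, threshold: float = 0.05) -> bool:
    """Apply an illustrative threshold to the specular fraction."""
    return specular_fraction(gray) > threshold

dry = np.full((8, 8), 120, dtype=np.uint8)   # matte skin patch
wet = dry.copy()
wet[:2, :] = 250                             # strong highlight band
```

With polarized illumination and a matching polarizing filter on the camera, as mentioned in the list above, non-specular background light can be suppressed before this measurement.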
The tracking system of the invention may comprise safety rules associated with one or more of said physiological parameters, such as respiration, heart rate, transpiration, for example. The system may thus determine an undesired situation as defined elsewhere in this specification if threshold values with respect to one or more of these parameters are met, exceeded and/or undercut, for example.
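Such safety rules can be represented as allowed ranges per physiological parameter, with a breach reported whenever a determined value falls outside its range. The rule names and limits below are illustrative assumptions, not taken from the claims:

```python
def safety_breach(params: dict, rules: dict) -> list:
    """Return the names of parameters whose values fall outside their
    allowed (low, high) range; an empty list means no rule is breached."""
    return [name for name, (low, high) in rules.items()
            if name in params and not (low <= params[name] <= high)]

# Illustrative ranges for a monitored individual.
rules = {"heart_rate_bpm": (30, 180), "respiratory_rate_bpm": (8, 60)}
breaches = safety_breach(
    {"heart_rate_bpm": 195, "respiratory_rate_bpm": 20}, rules)
```

A non-empty result would then trigger a safety measure via the output unit, for example an acoustic warning or an electronic message.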
In an aspect, the invention provides the use of the tracking system of the invention for monitoring, tracing, tracking and/or displaying one or more selected from the group of: a professional activity, a security and/or safety training event and/or security and/or safety exercise, a sports event, and a military training event.
In an aspect, the invention provides the use of the tracking system of the invention for monitoring, tracing, tracking and/or displaying the scene of a sports event, of a professional activity, of a training event, such as a security or safety exercise or training event, and of a military training event.
In an aspect, the invention provides the use of the tracking system of the invention for monitoring, tracing, tracking and/or displaying a sportsperson, for example at a sports and/or training event.
In an aspect, the invention provides the use of the tracking system of the invention for assessing the risk of an accident involving, for example one or more individuals and/or vehicles, and/or of bodily harm of an individual, for example a sportsperson, for example during training and/or competition.
In an aspect, the invention provides the use of the tracking system for detecting one or more selected from the group of: (1) a deviation from safety rules, (2) an increased risk of accident, and, (3) other potentially harmful events. In an aspect, the invention provides the use of the tracking system for reducing the occurrence of accidents in professional environments and/or for increasing the safety in professional environments.
In an aspect, the invention provides the use of the tracking system of the invention for displaying parameters determined by the tracking system of the invention. Preferably, said parameters are displayed in real time. In an aspect, the invention provides a method for monitoring, tracing and/or tracking an object and/or individual, the method comprising the step of providing the tracking system of the invention and monitoring, tracing and/or tracking an object and/or individual with said system. In an aspect, the invention provides a method for collecting information and/or determining one or more parameters of an object and/or individual during an event selected from the group of: professional activity, a professional and/or security or safety training event, a sports event and/or training event in general. In an embodiment, the method comprises the steps of: producing an image and/or a sequence of images of said object and/or individual; computing, from said image(s), said information and/or parameter(s).
Most of the innovations presented herein, contrary to state of the art solutions, are advantageous in that they do not put any constraint on the entity, the latter being observed from a distance and without the need to carry a distinctive, active or passive element. In order to do so, cameras that are sensitive not only to visible light but also to the infrared and/or ultraviolet spectrum are used. The combination of cameras sensitive to different wavelength ranges, for example standard cameras, infrared cameras, ultraviolet cameras and/or multispectral or hyperspectral cameras, increases the reliability of parameters retrieved by processing and analysing the images. Such parameters are, for example, physical parameters, such as the trajectory, the direction of movement, the position, speed, number of steps used for a certain distance, or physiological parameters such as respiratory rate, heartbeat rate, cardiac rhythm, temperature, transpiration, and galvanic properties of the skin of one or more entities. The data obtained in this manner can be combined with data retrieved from locally installed microsystems that are connected to a central installation via wireless connection networks, such as optical or radio wave based means.
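Among the physical parameters listed above, speed is obtained directly from successive ground-plane positions of the tracked entity, for example positions read off the calibrated grid of Figure 14. A minimal sketch, assuming positions in metres and a known inter-frame interval:

```python
import math

def speed_mps(p1: tuple, p2: tuple, dt: float) -> float:
    """Ground speed (m/s) from two successive ground-plane positions,
    e.g. obtained from a calibrated camera image, dt seconds apart."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1]) / dt

# Entity moves from (0, 0) to (3, 4) metres in 0.5 s.
v = speed_mps((0.0, 0.0), (3.0, 4.0), 0.5)
```

Accumulating such segment lengths along a trajectory likewise yields the distance covered, and dividing it by the stride length gives the number of steps.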
The problems mentioned above are addressed thanks to the tracking system of the invention, which is capable of measuring physical, physiological, and/or biometric parameters.
Further aspects and preferred embodiments of the invention are defined herein below and in the appended claims. Further features and advantages of the invention will become apparent to the skilled person from the description of the preferred embodiments given below.
Brief Description of the Drawings
Figure 1 schematically represents a tracking system in accordance with an embodiment of the invention, using two video cameras, one sensitive to infrared light allowing quick and reliable recognition of a human horse rider, and the other camera sensitive to visible light, allowing a detailed analysis of the rider or his action.
Figure 2 schematically shows in greater detail a data processing unit of an embodiment of the invention.
Figures 3a and 3b schematically represent a tracking system comprising a plurality of cameras, which have different fields of view, in order to increase the monitored area (Figure 3a) or in case of zones covered by several cameras, for increasing the degree or level of observability and/or to create different angles and/or viewpoints (Figure 3b).
Figure 4 schematically shows a tracking system in accordance with an embodiment of the invention, the system comprising a hyperspectral camera the spectral resolution of which in the range from 700 to 1100 nm is at least 20 nm. This type of camera, in combination with an appropriate data processing unit, is capable of determining the respiration rate of a living subject.
Figure 5 schematically illustrates an image acquisition system in accordance with an embodiment of the invention, comprising a tracking system adapted to follow the trajectory of a horse and to determine information regarding the number of steps/strides and/or length of steps/strides. The information is displayed on a screen.
Figure 6 schematically illustrates the generation of data and its storage and display in accordance with an embodiment of the invention. In this embodiment, several moving objects or individuals are identified, so that the acquired parameters can be used to make comparisons between the different individuals.
Figure 7 illustrates a system for tracking motorized forklift trucks in accordance with an embodiment of the invention. The forklift is equipped with a tag that can be easily identified by the image treatment/analysis system. The analysis of the images generates data that can be compared to rules and which can thereafter be stored in an encrypted database that can be consulted by a user for statistical purposes. If data protection conditions allow, the user may get access to a specific image associated with the result of the statistical data analysis.
Figures 8 A to 8 C show images taken with an infrared camera, allowing detection of whether individuals wear safety glasses.
Figures 9 A to 9 C are technical drawings corresponding to the images shown in Figures 8 A to 8 C.
Figure 10 shows an image taken by a visible light (RGB) camera of a tracking system of the invention configured to detect whether human operators carry hardhats or not.
Figure 11 is a technical drawing corresponding to the image of Fig. 10.
Figures 12 A and 12 B show thermal pictures generated, processed and/or analysed in accordance with an embodiment of the invention. The event of exhalation can be recognized on Figure 12 A and can be seen to be absent in Figure 12 B. By determining the duration of one or more inhalation-exhalation cycles, the data processing system can determine the respiratory rate of an individual.
Figure 13 A shows stacked image-data captured by a hyperspectral camera over a period of time, in accordance with an embodiment of the invention. The image data is captured in the form of a line shown horizontally in the figure, with the first image/line captured being represented on top of the figure.
Figure 13 B shows the outcome of the analysis of the image-data generated in Figure 13 A. From changes of light intensity, repeating patterns become apparent, which the data processing unit, in particular an analysing unit, can use to determine the heart rate of the individual from which the image data was created.
Figure 14 is a photograph taken with an RGB (visual light) camera, calibrated in accordance with an embodiment of the invention, so as to contain a grid or coordinate system representing the position on the ground.
Figure 15 is a device allowing rapid geometric correction and/or calibration of a camera used in the system of an embodiment of the invention, the calibration device comprising light sources connected to each other via flexible connections having a defined distance.
Figure 16 shows an image captured by an infrared camera, allowing rapid detection of a target entity, which may be used for assisting the localisation algorithms analysing an RGB image.
Detailed Description of the Preferred Embodiments
Figure 1 shows an embodiment of a tracking system 1 of the present invention. The system comprises one or more cameras 2, 3. There is a first camera that is sensitive to visible light 2, and a second camera, which is an infrared camera 3. The cameras may take still and/or single images, but preferably one or both of the cameras are adapted to take successive images, in particular videos and/or films. The system further comprises one or more data processing units 5, which is/are capable of analyzing the images as will be described further below. The data processing unit preferably is an image data processing unit. The system of the invention further comprises one or more output or display units 10, 11. In some embodiments, the output unit is used for displaying the parameters and data determined by the data processing unit(s) 5. In other embodiments, the output unit may be for taking safety measures, such as producing warnings with respect to the occurrence of a risk as determined by the tracking system of the invention. The warning may be a sound-related warning, in which case the output unit preferably comprises a loudspeaker. The warning may also be displayed on a screen or transmitted via an electronic message, such as an email or sms. Other safety measures are disclosed elsewhere in this specification.
In a preferred embodiment, the tracking system is configured to analyze at least one physical, physiological and/or biometric parameter associated with said entity 13.
The cameras 2 and/or 3 are adapted to monitor a determined geographical area of interest 12. Events taking place in the area of interest 12 are the subject of measurements and parameter determination in accordance with the invention. The area of interest 12 may be any area for which data as reported herein may be of interest. The area of interest 12 may be a working area and/or an area where professional activity is taking place. The area of interest 12 may also be an area where a sports activity takes place, for example a tournament and/or a competition. The system of the invention is preferably adapted to produce data with respect to an entity 13 moving within the area of interest.
In an embodiment, the system of the invention may be used for monitoring professional activities, training and/or security or safety events or exercises, military training events and/or a sports event. Security and safety training events include, for example, training events of the fire brigade, police, ambulance, outpatient departments, and/or training events of security staff. For example, the tracking system of the invention may be used to monitor an exercise of the fire brigade. The area of interest is preferably selected so as to cover the entire training event and/or a relevant part of the event.
In an embodiment, the sports event is selected from a competition, for example an equitation event, for example a horse race and/or a showjumping event, a team sport event, such as a football match, a basketball match, a handball match, a baseball match, an American football match, a rugby match, or a racket sports event, such as a tennis match or a badminton match, for example. The entity 13 may be an object, for example a vehicle, or may be a living individual, for example a human or animal subject. In Figure 1, the moving entity 13 is a horse ridden by a horse rider. Further shown in Figure 1 is an accessory object 18, which here is an obstacle in a horse riding competition. The accessory object 18 may function as a reference object or point, as described elsewhere in this specification. In some embodiments, the tracking system 1 is adapted and/or configured to generate one or more physical, physiological and/or biometric parameters of the entity 13. Of course, physiological and/or biometric parameters are assessed only with respect to living entities, in particular a human or animal having the respective parameter. As the skilled person will also understand, certain physical parameters can only be determined with respect to moving entities, such as speed, direction of movement, and trajectory, for example. In an embodiment, the tracking system of the invention is configured to determine a trajectory 16 of said moving object or individual in said geographical area of interest 12, for example. These latter parameters may be determined with respect to living entities as well as moving (non-living) objects.
Examples of physiological parameters are respiration, heart rate, transpiration, and body temperature. In a preferred embodiment, the physiological parameter is a health parameter and/or is associated with the health or health status of an individual. For this reason, the physiological parameter can be used for establishing safety rules.
Examples of biometric parameters are parameters related to the surface of the entity, for example, in the case of human or animal entities, skin color, hair characteristics such as hair color, and surface color and/or paint.
The camera 2, 3 is preferably selected so as to be useful in the image analysis as foreseen in the system of the invention. Figure 2 shows in greater detail the one or more data processing unit 5. The one or more data processing unit may be in the form of one or more computers. In this regard, the word "data processing unit" includes the plural form. The data processing unit 5 as shown in Figure 2 comprises a detection unit 6, which is capable of detecting the presence of a moving entity of interest 13 in the images captured by camera 2. It is noted that the expressions "capable of" and "is configured to", in the context of the data processing unit, include and/or refer to "is programmed to" and/or "contains software that is capable of" and/or "runs algorithms capable of". When detecting a moving entity of interest, the detection unit 6 is preferably capable of distinguishing moving entities that are not of interest from those of interest, for example a movement of a spectator or an umpire from the movement of a horse to be tracked. Once a moving entity of interest is detected by detection unit 6, the tracking unit 7 is activated. The tracking unit 7 tracks the moving entity 13 on the successive images of the film. In so doing, the moving entity is also identified, as identification takes place along with tracking. Accordingly, part of the tracking unit 7 may comprise an identification unit 8. The identification unit is configured to identify an entity 13. For example, the entity 13 is identified as a human operator or a particular vehicle and the like.
In an embodiment, said one or more data processing units comprise a detection unit 6, adapted to detect said entity 13 within said image 4, a tracking unit 7 adapted to track the detected entity 13 on the successive images taken by the camera 2, 3, and an analyzing unit 9, adapted to determine a parameter related to said object or individual 13 tracked by the tracking system.
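Purely by way of illustration, the cooperation of the detection unit 6, the tracking unit 7 and the analyzing unit 9 can be sketched as a per-frame pipeline. All class names, the frame format and the derived parameter in the following Python sketch are assumptions of this illustration, not features of the disclosure.

```python
# Minimal sketch of the detection -> tracking -> analysis pipeline of
# data processing unit 5 (units 6, 7 and 9).  Names are illustrative.

class DetectionUnit:
    """Detects entities of interest in a single frame (unit 6)."""
    def detect(self, frame):
        # A real implementation would run background subtraction or a
        # trained detector; here a frame is assumed to already be a
        # list of (entity_id, x, y) detections.
        return frame

class TrackingUnit:
    """Associates detections across successive frames (unit 7)."""
    def __init__(self):
        self.tracks = {}          # entity_id -> list of (x, y) positions
    def update(self, detections):
        for entity_id, x, y in detections:
            self.tracks.setdefault(entity_id, []).append((x, y))
        return self.tracks

class AnalyzingUnit:
    """Derives parameters from a track (unit 9), e.g. path length."""
    def path_length(self, track):
        return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
                   for (x1, y1), (x2, y2) in zip(track, track[1:]))

# Example: one entity ("13") advancing one metre per frame along x.
detector, tracker, analyzer = DetectionUnit(), TrackingUnit(), AnalyzingUnit()
for frame in [[("13", 0.0, 0.0)], [("13", 1.0, 0.0)], [("13", 2.0, 0.0)]]:
    tracks = tracker.update(detector.detect(frame))
print(analyzer.path_length(tracks["13"]))  # 2.0
```

The separation into three objects mirrors the units 6, 7 and 9; in practice these may equally be software components of a single computer, as stated above.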
In an embodiment, the tracking system of the invention is configured to identify said entity 13 by methods and/or analyses selected from the group of: ballistic analysis (speed, movement pattern), form recognition, identification of a visual or magnetic tag (19), color analysis, and analysis of the surface structure. The entity 13 may be recognized by a combination comprising two or more of said analyses and/or methods. The identification is preferably accomplished by said identification unit 8.
The cameras 2, 3 of the system of the invention are preferably selected in accordance with the algorithms used for image analysis and/or so as to allow for identification of the entity 13 or of the compliance with safety rules while simplifying the complexity of the algorithms. In an embodiment, the system of the invention comprises at least one camera selected from the group of: visible light cameras; cameras that are sensitive to infrared light and/or that capture infrared light, such as cameras selected from infrared (IR) cameras, near infrared (NIR) cameras and thermal cameras; time-of-flight cameras; short bandwidth cameras; UV cameras; and other cameras as specified elsewhere in this specification. The invention encompasses the use of two or more different or identical cameras. In an embodiment, the tracking system comprises a camera that is capable of capturing ultraviolet (UV) light.
The identification unit 8 is capable, for example, of recognizing a tag 19 that may be provided on the moving entity 13 in order to facilitate identification of the moving entity. The identification unit is also capable of identifying moving entities, for example persons, and of identifying objects, such as safety equipment, for example.
Data produced by the tracking unit 7 is received by an analyzing unit 9. The analyzing unit 9 contains algorithms and computer programs and is capable of determining various parameters or situations, in particular physical, physiological and biometric parameters of the moving entity 13 tracked with the tracking unit 7, or the occurrence of a situation that is associated with a risk, such as a risk of an accident, for example. Figure 2 also shows the displaying units 10 and 11, on which data calculated by the analyzing unit 9 is presented in an appropriate form to be understood by an observer. The invention takes into account that, depending on the observer (medicinal staff, umpire, spectator), different data may be helpful or necessary, which is why there are different display units 10, 11. Furthermore and/or alternatively, one display unit 10 may be used to display data in real time, whereas the other display unit 11 may be used to display recapitulative information, for example information covering a distance, a time period, and/or an entire performance of a moving entity. For example, the display unit 11 may be used to display average values (average speed, etc.), or trajectories. In other embodiments, the display unit is used to produce a safety measure with respect to the occurrence of a risk or otherwise undesired situation.
The detection unit 6, the tracking unit 7, the identification unit 8 and/or the analyzing unit 9 may be separate physical entities, for example one or more computers or hardware components, or they may be in the form of software components. In an embodiment, the data processing unit 5 is one or more computers comprising software and/or hardware components functioning as one or more selected from the group of detection, tracking, identification and/or analyzing unit 6, 7, 8, 9.
The tracking system 1 of the embodiment shown in Figure 3 A comprises a plurality of cameras 2.1, 2.2, 2.3, which observe adjacent and slightly overlapping areas of interest 12.1, 12.2, 12.3. In this embodiment, the moving entity 13 can be tracked over a larger overall area, the larger area being the sum and/or combination of the separate areas 12.1, 12.2, 12.3. Preferably, the cameras 2.1, 2.2, 2.3 are synchronized so as to allow continuous tracking of the moving entity 13.
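The hand-off between adjacent cameras can be illustrated as follows. The sketch reduces each area of interest to an interval along one axis; the interval boundaries are illustrative values, not features of the disclosure.

```python
# Areas of interest 12.1-12.3 reduced to illustrative x-intervals (metres);
# adjacent intervals overlap slightly, as in Figure 3 A.
AREAS = {"2.1": (0.0, 40.0), "2.2": (35.0, 75.0), "2.3": (70.0, 110.0)}

def active_cameras(x):
    """Cameras whose area of interest currently contains position x.
    In an overlap region (e.g. 35-40 m) two cameras observe the entity
    at once, which is what allows the hand-off to be seamless."""
    return [cam for cam, (lo, hi) in sorted(AREAS.items()) if lo <= x <= hi]

print(active_cameras(20.0))   # ['2.1']
print(active_cameras(37.0))   # ['2.1', '2.2']  (overlap: both track)
print(active_cameras(90.0))   # ['2.3']
```

In a real system the cameras would additionally be time-synchronized so that the track handed over from one camera continues without a gap.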
In an embodiment, the tracking system of the invention comprises a plurality of cameras adapted to observe and/or produce images of a continuous geographical area formed by said adjacent and/or overlapping geographical areas 12.1, 12.2, 12.3.
In an embodiment, the tracking system of the invention comprises a plurality of cameras, wherein said cameras are positioned so as to increase and/or optimize the observability of said object and/or individual 13.
In an embodiment, the tracking system of the invention comprises a plurality of cameras, 2.1, 2.2, 2.3, so as to allow the detection of an object and/or individual 13 and/or of a tag 19 placed on said object and/or individual 13 even if said object, individual 13 and/or tag 19 cannot be detected from one of said cameras 2.1, for example due to an unfavorable orientation of said object, individual 13 and/or tag 19 with respect to said camera 2.1.
Figure 3 B shows a tracking system according to an embodiment of the invention, wherein two or more separate cameras 2.1, 2.2 are used to observe a defined geographic area of interest 12.1. The plurality of cameras 2.1, 2.2 observe the area 12.1 from different positions and/or angles. The availability of differently positioned cameras observing a defined geographic area 12.1 improves the level of observability, increasing the probability that at least one camera captures images that can be used for generating and/or retrieving data and parameters as defined in this specification. Similar to Figure 3 A, Figure 3 B shows that adjacent and/or slightly overlapping geographic areas 12.1-12.3 are observed, each area by a pair of cameras, 2.1 and 2.2; 2.3 and 2.4, and 2.5 and 2.6, respectively.
Two cameras monitoring (at least part of) a given zone may be of the same type, for example two visible light cameras, or may be of two different types, for example: (1) a visible light and an infrared camera, (2) a visible light and a hyperspectral and/or multispectral camera, and (3) an infrared and a hyperspectral and/or multispectral camera, for example.
In an embodiment, the tracking system of the invention comprises at least two different cameras, a first camera 2 and a second camera 3, wherein said first and second cameras are sensitive to and/or capture light of different wavelengths and/or wavelength ranges. In an embodiment, the tracking system of the invention comprises at least two different cameras 2, 3, a first camera 2 and a second camera 3. Preferably, said first camera 2 is capable of capturing visible light and said second camera 3 is capable of capturing infrared light.

Figure 4 illustrates an embodiment of the tracking system during operation. In the image 4, a reference zone or zone of analysis 14 is detected and tracked by the image processing and/or analyzing software. In this reference zone 14, image analysis is performed, for example in order to determine the heart rate. Heart rate is generally expressed as the number of heart beats per time interval, for example heart beats per minute (bpm). On the display 12, speed and heart rate of the horse (in this case 25 km/h and 67 bpm) are displayed in real time. Camera 2 may be a visual light camera and camera 3 may be an infrared camera, guided and/or controlled by the data processing unit to capture a close-up image 30, in which the head of the horse is detected and tracked, so as to obtain an image containing more image data of the reference zone 14.
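A heart-rate estimate from the reference zone 14 can be illustrated with the following deliberately simplified sketch: it counts upward crossings of the mean level in the zone's brightness signal over successive frames. This crossing-counting approach, the synthetic signal and its amplitude are assumptions of the illustration, not the algorithm of the disclosure.

```python
import math

def heart_rate_bpm(brightness, fps):
    """Estimate beats per minute from the mean brightness of reference
    zone 14 over successive frames by counting upward crossings of the
    mean level.  A simplified stand-in for the analysis performed by
    analyzing unit 9; thresholds and method are illustrative only."""
    mean = sum(brightness) / len(brightness)
    centered = [b - mean for b in brightness]
    # Count transitions from below the mean to at/above the mean.
    crossings = sum(1 for a, b in zip(centered, centered[1:]) if a < 0 <= b)
    return crossings * 60.0 * fps / len(brightness)

# Synthetic 10 s clip at 25 fps carrying a ~67 bpm periodic component.
fps = 25
pulse_hz = 67 / 60.0
signal = [100 + 2 * math.cos(2 * math.pi * pulse_hz * n / fps)
          for n in range(250)]
bpm = heart_rate_bpm(signal, fps)
print(round(bpm))
```

A production system would use a more robust spectral estimate and motion compensation of the tracked zone; the sketch only shows where such an analysis plugs into the pipeline.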
Figure 5 schematically illustrates the determination of the number of steps conducted by the moving entity 13, here a horse, between two obstacles 18.1 and 18.2, which may also function as fixed reference points in the geographic area of interest 12. In the image 4 taken by, for example, camera 2, a zone of analysis or reference zone 14 is identified, tracked and analyzed. Suitable algorithms are applied in order to determine the number of steps/strides conducted by the moving entity between the obstacles 18.1 and 18.2. The distance between the obstacles 18.1 and 18.2 can serve as a reference distance, and when the moving entity 13 passes the distance, the speed of the entity 13 when passing the reference distance can be determined. On the screen 12, the number of steps used to cover the reference distance may be displayed. The number of steps/strides may be determined by the analysis unit 9, for example from the analysis of the movement, which may determine the time interval of the movement and/or the contact points 32 with the ground surface, for example.

Figure 6 schematically illustrates the tracking of a plurality of moving individuals within the geographic area observed by the tracking system. In the embodiment shown, the tracking system is used to monitor a horse race with several competitors 13.1, 13.2 and 13.3 running simultaneously. Parameters determined by the data processing unit 5 are displayed on screens 10 and 11. Screen 10 shows an overview image indicating the trajectory of the moving individuals in the area of interest 12 as they advance. In an embodiment, screen 10 is a first screen and is accessible to spectators and/or the public of a sports event, for example. Screen 11 shows more detail, including physiological data, such as heart rate ("Heart"), respiratory rate ("Breath"), the body temperature ("Temp.") and transpiration ("Skin"). In an embodiment, screen 11 is available to an umpire and/or to a medical and/or veterinary staff.
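The speed over the reference distance and the stride count can be sketched as follows. The frame numbers, the 20 m reference distance and the merging window for duplicate contact detections are illustrative assumptions of this sketch.

```python
def speed_over_reference(distance_m, frame_enter, frame_exit, fps):
    """Speed (km/h) of entity 13 while covering the known reference
    distance between obstacles 18.1 and 18.2."""
    elapsed_s = (frame_exit - frame_enter) / fps
    return distance_m / elapsed_s * 3.6

def count_strides(contact_frames, min_gap=3):
    """Count strides from the frame numbers at which a ground contact
    point 32 is detected; detections closer together than min_gap
    frames are treated as the same contact."""
    strides, last = 0, None
    for f in sorted(contact_frames):
        if last is None or f - last >= min_gap:
            strides += 1
            last = f
    return strides

# 20 m between the obstacles, entered at frame 100 and left at frame 172
# of a 25 fps video; contact points partly duplicated on adjacent frames.
speed = speed_over_reference(20.0, 100, 172, 25)
strides = count_strides([100, 101, 118, 119, 136, 154, 171])
print(round(speed, 1), strides)   # 25.0 km/h in 5 strides
```

The reference distance turns a purely image-based measurement into a calibrated physical one, which is why the obstacles 18.1, 18.2 are useful as fixed reference points.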
The data shown on screen 11 allows the appropriate person to assess the health status of the moving individual, for example the horse. The parameters allow an assessment of the risk of an adverse event, such as exhaustion, accident, and/or injury, for example. If the risk of an adverse event exceeds a predetermined threshold value, the tracking system of the invention produces a safety measure, such as a visible and/or audible warning, for example. To this end, the system comprises at least one safety rule, which allows it to determine whether the health of an individual is in danger. The warning may call on a participant individual to stop the race, or a visible warning may be displayed on a screen. In this way, the participant individual may be forced to interrupt the race and/or be withdrawn from the competition, or any other measure may be taken to avoid an accident and/or harm.
The trajectory of a moving entity may be determined from the positions of the moving entity on successive images taken by the camera. In an embodiment of the tracking system, said data processing unit 5 is adapted to determine a trajectory 16 of an entity 13 by determining and/or storing successive positions of said entity on the geographical area of interest 12.
By extrapolation of the trajectory, an extrapolated trajectory may be obtained, indicating future positions of the moving entity. Taking also the speed of the moving entity into account, the future positions of a moving entity as a function of time within the area of interest may be determined and/or calculated. The tracking system of the invention preferably comprises safety rules that take the extrapolated trajectory and/or future positions at particular points in time of a moving entity into account. For example, from extrapolated trajectories, direction of movement and/or speed of two or more moving entities, the system may determine if the two or more entities are on a collision course with each other. In an embodiment, the system comprises a safety rule based on the presence of a collision course between two moving entities, one moving entity and a stationary object, or a moving entity and an entity that can move but does not move, for example because it has stopped moving. The system of the invention may also calculate the time remaining until a calculated or projected collision and may take the remaining time into account as part of the safety rule. For example, the system may produce a safety measure if an entity is predicted or projected to enter into a collision within 1 minute or less, for example 30, 20, 15, 10 or 5 seconds or less, based on parameters such as the extrapolated trajectory, direction of movement and speed of one or more moving entities, for example. In this regard, it is noted that in equestrian competitions, detrimental exhaustion that can even lead to death is a widespread problem that the present invention alleviates.
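The extrapolation and collision-course check described above can be sketched as follows, assuming linear extrapolation from the two most recent observed positions; the time step, the 2.5 m proximity radius and the horizon are illustrative safety-rule parameters, not values from the disclosure.

```python
import math

def extrapolate(track, dt, t):
    """Linearly extrapolate a trajectory 16: 'track' holds the most
    recent observed positions (x, y), sampled dt seconds apart."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return x1 + vx * t, y1 + vy * t

def time_to_collision(track_a, track_b, dt, radius, horizon_s=60.0):
    """First time (s) at which the extrapolated positions of two
    entities come within 'radius' metres of each other, or None within
    the horizon.  Step size, radius and horizon are illustrative."""
    t = 0.0
    while t <= horizon_s:
        xa, ya = extrapolate(track_a, dt, t)
        xb, yb = extrapolate(track_b, dt, t)
        if math.hypot(xa - xb, ya - yb) <= radius:
            return t
        t += 0.1
    return None

# Entities 13.1 and 13.2 heading towards each other, 19 m apart,
# each moving 1 m per 1 s sample.
a = [(0.0, 0.0), (1.0, 0.0)]
b = [(21.0, 0.0), (20.0, 0.0)]
t = time_to_collision(a, b, dt=1.0, radius=2.5)
print(round(t, 1))   # about 8.3 s until the projected proximity
```

The returned remaining time can then be compared against the thresholds of the safety rule (for example 30, 20, 15, 10 or 5 seconds) to decide whether a safety measure is produced.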
Figure 7 illustrates the use of the tracking system of the invention in a professional environment, for example a warehouse, in which motorized vehicles 13.1, 13.2 circulate. In this case, the geographic area of interest 12 covers at least part of the area in which the vehicles 13.1 and 13.2 - here forklift trucks - operate. Although not shown in Figure 7, the invention envisages situations in which one or more pedestrians are present, in addition to vehicles. The combined occurrence of vehicles and persons advancing on foot in a given geographical area makes the use of the system of the invention even more necessary and advantageous.
A "professional environment" or an area covering a professional activity, for the purpose of the present invention, preferably concerns environments where human operators execute manual work and/or operate moving objects, engines that have moving components, and/or vehicles. Typical professional environments are building, construction and industry sites, warehouses, manufacturing sites, factories, mills, plants, sites of packaging of goods, sites where goods are transported and/or exchanged, sites where goods are filled and/or transported, and the like.
The image 4 generated by camera 2, for example, allows the analyzing unit to determine physical parameters, such as the positions, orientations and/or speeds of the pedestrians and/or vehicles. The data processing unit 5 is adapted to evaluate the data created by the image analysis unit to determine the risk of an accident. In case the tracking system detects a dangerous situation, a warning may be produced, which is preferably output, for example by way of an acoustic signal or on a screen 11. Data are preferably stored on a memory, for example in a database 15, which allows statistical or other analysis, for example in order to improve the working process and/or the behaviour of the drivers. The data processing unit 5 may also process sound signals, which may be used to determine the occurrence of malfunctioning devices or vehicles, impacts and/or collisions.
In an embodiment, the data processing unit 5 is configured to detect, identify and track entities 13.1, 13.2 appearing in the area of interest 12, determine relevant parameters such as position, orientation and speed, and compare the data produced with rules, in particular safety rules. The data processing unit 5 may then check if the conditions of the rules are met and, if so, trigger the appropriate signal, warning or other action. Examples of rules are, for example, the pedestrian-vehicle distance, the loaded vehicle (forklift) direction, the presence of objects in the pathway/direction of a vehicle, and the distance between vehicles. For example, in case the data processing unit detects that a distance between a pedestrian and a vehicle is smaller than a reference distance defined by the rule, a warning (signal) is produced, and/or the working process is interrupted, for example.
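A rule check of this kind can be sketched as follows. The entity representation, the 5 m pedestrian-vehicle distance and the 10 m/s speed limit are illustrative assumptions of the sketch, not thresholds from the disclosure.

```python
import math

def check_safety_rules(entities, min_ped_vehicle_dist=5.0, max_speed=10.0):
    """Compare tracked parameters with safety rules and collect warnings.
    'entities' maps an id to a dict with 'kind', 'pos' (x, y in metres)
    and 'speed' (m/s); thresholds are illustrative only."""
    warnings = []
    items = list(entities.items())
    # Rule 1: vehicle speed limit.
    for eid, e in items:
        if e["kind"] == "vehicle" and e["speed"] > max_speed:
            warnings.append(f"{eid}: speed limit exceeded")
    # Rule 2: minimum pedestrian-vehicle distance.
    for i, (id_a, a) in enumerate(items):
        for id_b, b in items[i + 1:]:
            if {a["kind"], b["kind"]} == {"pedestrian", "vehicle"}:
                d = math.dist(a["pos"], b["pos"])
                if d < min_ped_vehicle_dist:
                    warnings.append(f"{id_a}/{id_b}: distance {d:.1f} m")
    return warnings

# Example scene from Figure 7: two forklifts and one pedestrian.
entities = {
    "13.1": {"kind": "vehicle", "pos": (0.0, 0.0), "speed": 12.0},
    "13.2": {"kind": "vehicle", "pos": (30.0, 0.0), "speed": 6.0},
    "ped1": {"kind": "pedestrian", "pos": (3.0, 0.0), "speed": 1.2},
}
for w in check_safety_rules(entities):
    print(w)
```

Each warning string would in practice be routed to the output units 10, 11 (screen, loudspeaker, message), or trigger an interruption of the working process, as described above.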
In an embodiment, the tracking system of the invention comprises one or more safety rules, preferably a set of several safety rules. The safety rules are preferably part of the analyzing unit 9. The tracking system is preferably programmed to contain said safety rules. The safety rules are preferably associated with the physical, physiological and/or biometric parameters. For example, the safety rules determine threshold values, which, when met, surpassed and/or undercut, indicate the presence of an undesired situation, such as an increased risk of accident, for example. The system of the invention is preferably configured to determine whether there is a breach of any safety rule, and, in case of breach, produce a safety measure.
Exemplary safety rules are based on values and parameters selected from the group of: the speed of a moving entity; the distance between two different entities 13.1, 13.2, if at least one of the entities is moving; the change of the distance between two entities; the driving sense of a moving entity 13 (forwards or backwards); the presence and volume of a charge on a charged moving entity; the presence or absence of safety equipment; and the like.
For example, the size or volume of a charge on a vehicle, such as a forklift, may have an impact on the driver's field of view. If the load is too large, the tracking system may detect the violation of a safety rule and take a safety measure.
The safety rules are preferably monitored continuously and in real time by way of algorithms contained in the analyzing unit 9. In an embodiment, the tracking system is configured to detect or determine, by comparing said parameter with said safety rule, the occurrence of an undesired situation selected from one or more of the group of: (1) a deviation from safety rules, (2) an increased risk of accident, and, (3) potentially harmful events.
In an embodiment, the tracking system of the invention is configured to detect a breach of a safety rule from one or more parameters selected from the group of: the position, orientation, direction and speed of the entity 13.1, 13.2; the distance between two different entities 13.1, 13.2; the distance between a moving entity and a stationary entity or object; the direction of movement between a moving entity and a stationary object or entity; and combinations of the aforementioned.
In the case of physiological parameters and/or parameters associated with the health of a living entity, said system preferably comprises safety rules for determining the presence of a risk for the health of the entity 13.
The entity of interest 13 may comprise a tag 19, as shown on the vehicles 13.1, 13.2 in Figure 7. The tags 19 facilitate identification and tracking by the tracking unit 7 and identification unit 8 (Fig. 2). It is noted that, depending on the angle of view of the moving object with respect to the camera, the tracking of a defined moving object of interest may be difficult, because the outline of the object changes with the perspective. In these cases, the use of a tag allows rapid and reliable identification and/or tracking. In other embodiments, a tag is not necessary and may thus be absent, because the camera is capable of reliably identifying a particular object without the need of a tag. The tracking system of the invention is in particular useful for avoiding accidents in stressful or hectic situations, in moments of increased workload or increased professional activity and/or in moments of increased pressure, such as time pressure, or pressure on the staff in general. It has been observed that in such situations, operators tend to neglect or forget safety rules, which is one reason why accidents in a professional environment occur more frequently in such situations.
Figures 8 A through 8 C and 9 A- 9 C illustrate another embodiment of the invention related to the reduction of risks of accidents occurring in a professional environment and/or improving safety during professional activity. Figs 8 A-8 C are original image data (photographs) which may be analyzed by the tracking system 1 of the invention, whereas Figs 9 A-9C illustrate similar images by way of drawings. Most reference numerals are inserted in Figs. 9 A-C only.
In Figs 8 A-9 C, the moving entity 13 (here: individuals 13.1, 13.2, 13.3 in Figs. 8 A-9 C, respectively) is a human individual. The area of interest 12 may be a professional workplace as shown in Fig. 7. Alternatively, the camera of the tracking system may monitor an access to the working place, so that each human operator has to pass in front of a more restricted area covered by the camera. The access may be a door or a corridor by which the operators 13 access the workplace, for example. The tracking system 1 is configured to identify the head 31 of the individual so as to determine the reference zone 14. The system then checks if safety equipment 21 is present in or close to the reference zone 14. If desired and/or useful, the system 1 may be configured to identify a more restricted, second reference zone 14.1 within a larger, first reference zone 14. For example, in a first step the head of the moving operator 13 (13.1-13.3) is identified for defining a first reference zone 14, and, in a subsequent step, the eyes of the operator are identified so as to define a second, smaller reference zone. The image analysis is then conducted within the second reference zone. In the embodiment shown, the system is configured to identify the zone of the eyes of the individual within the first reference zone 14, so as to determine the position of the second reference zone 14.1.
In the example shown, the safety equipment is safety glasses 21, which the tracking system seeks to identify within the reference zone, in particular at the position of the eyes of the operator.
In an embodiment, the tracking system is configured to determine a parameter, said parameter allowing the detection of the occurrence of an undesired situation. Preferably, the system of the invention uses said parameter for detecting the occurrence of an undesired situation, such as the breach of a safety rule. In an embodiment, the undesired situation is selected from one or more of the group of: (1) a deviation from safety rules, (2) an increased risk of accident, and, (3) other potentially harmful events. In an embodiment, the tracking system detects any one of (1), (2) or (3) in a professional environment.
In an embodiment, said parameter is related to the presence or absence of a safety equipment object 21 in said reference zone 14. In an embodiment, the safety equipment is selected from the group consisting of: safety glasses, a hardhat, a safety helmet, gloves, shoes, a life vest, and high visibility clothing, and combinations comprising two or more of the aforementioned. In an embodiment, the safety equipment comprises a particular tag, or a material reflecting light, such as light of a particular wavelength, for example. The particular light reflected by the safety equipment may be detected by the camera 2, 3. The tracking system of the invention may comprise a source of light. In an embodiment, the tracking system of the invention further comprises a light source 20 capable of emitting light, and wherein said data processing unit 5 is adapted to determine, from the light reflected from the surface of said object and/or individual 13, a parameter with respect to said individual 13. For example, a life vest or any other safety equipment may comprise a material or surface reflecting a particular light, for example IR or NIR light. Preferably, the camera of the tracking system is selected so as to capture the light of the particular wavelength as reflected by the material or surface, for example an IR, NIR or thermal camera in the case of a material or surface reflecting IR or NIR light. The light source is preferably selected so as to produce light that is reflected by the material or surface, such as an IR or NIR light source in the case of an IR or NIR reflecting surface. The use of a light source may thus facilitate the identification of an entity and/or the detection of the presence of safety equipment.
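Detection of reflective safety equipment within the reference zone can be sketched as a simple intensity test: pixels brighter than a threshold are attributed to the reflective surface, and the equipment counts as present when such pixels make up a minimum fraction of the zone. Threshold, ratio and the toy pixel grids are illustrative assumptions of this sketch.

```python
def equipment_present(zone_pixels, threshold=200, min_ratio=0.10):
    """Decide whether IR-reflective safety equipment 21 is visible in
    the reference zone 14.  'zone_pixels' is a 2D grid of intensities
    (0-255) from the camera sensitive to the reflected wavelength;
    threshold and ratio are illustrative choices."""
    flat = [p for row in zone_pixels for p in row]
    bright = sum(1 for p in flat if p >= threshold)
    return bright / len(flat) >= min_ratio

# 4x4 reference zone: the top row images the reflective band of a helmet.
with_helmet = [[230, 240, 235, 228],
               [ 90,  85,  88,  92],
               [ 80,  84,  79,  83],
               [ 76,  81,  77,  80]]
no_helmet   = [[ 90,  85,  88,  92]] * 4
print(equipment_present(with_helmet), equipment_present(no_helmet))
```

Restricting this test to the tracked reference zone 14 (the head, or the eyes within the head) is what allows the system to distinguish worn equipment from equipment merely carried in the hand, as discussed for Figures 10 and 11.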
In an embodiment, the tracking system of the invention is configured to produce a safety measure if an undesired situation is detected. In this manner, the tracking system is used for assuring the compliance with and/or the implementation of safety rules, such as the wearing of safety equipment during work.
In Figure 8 A, safety glasses are absent in the reference zone 14, and the tracking system of the invention will take a safety measure, such as the production of a warning. In an embodiment, the safety measure is the production of a visible or audible signal or message via said output unit 10, 11.
In an embodiment, the safety measure may be the sending of a message to a telephone or a computer. For example, the system of the invention may be configured to send an email or sms to the mobile phone or smart phone of an individual concerned by the breach of a safety rule. In an embodiment, the safety measure may be the sending of an alert message to an address, email address or phone number of an individual that committed the breach of a safety rule, for example, and/or to an individual otherwise concerned by the breach of the safety rule, for example to the individual that suffers an increased risk or undesired situation due to the breach of the safety rule.
In an embodiment, the output unit comprises an alarm notification appliance, such as a visible alarm signal entity, such as an alarm light, or an audible alarm entity, or a combination of both.
For example, if the image of Fig. 8 A is taken at the entrance of a working area, the output entity may comprise a screen indicating a warning, reminding the operator of the fact that he/she does not comply with safety rules at the working space. The screen is preferably placed so as to be easily visible to the operator 13, for example placed at eye height next to the entrance to the working area. Alternatively, or in addition, an audible warning may be produced. Audible warnings may include warning tones or a computer voice pronouncing the appropriate warning. In case the tracking system is capable of identifying the individual operator 13, the warning may be made with reference to the name of the operator. For example, the computer voice may directly call the operator's name. In other embodiments, the safety measure directly interrupts the process that is associated with the breach of a safety rule or the increased risk of an accident or other potentially harmful event. In this manner, the tracking system may be used to prevent the harmful event directly. In order to do so, the tracking system is preferably connected with a physical entity and acts on a physical device. For example, as a safety measure, the tracking system may block access to the working space for the operator, for example by blocking a door and the like, or may stop vehicles by remotely controlling such a vehicle. In this case, the output entity comprises suitable equipment to send a signal that is received by the vehicle and that stops the vehicle actively. The tracking system may also inform another central processing unit of the increased risk of an accident, said other data processing unit being able to control vehicles, moving objects or other machines having moving components.
The invention encompasses that the tracking system produces more than one safety measure, for example a combination of two or more safety measures as set out in this specification, or other safety measures. Safety measures, for the purpose of this specification, encompass any measure that has the purpose of undoing or reducing the occurrence of an undesired situation, of an increased risk of accident or of other harmful events, or which directly provides rapid assistance or help in case of the occurrence of an undesired event or situation, such as an accident.
Figure 8 B illustrates the situation where the tracking system of the invention identifies the safety equipment and does not produce any warning, because the operator 13 complies with the safety rules. The tracking system may also produce a positive message in such a case, such as a smiley displayed on a screen or an encouraging audible message. Preferably, the algorithms of the tracking system of the invention are capable of distinguishing usual gear from safety equipment. As shown in Fig. 8 C, the tracking system is configured to recognize the appearance of normal glasses 24 (for example, for correcting vision) in the area of interest 14 and to determine that the normal glasses are different from the safety glasses. Therefore, a warning or safety measure is preferably also produced in the case of Fig. 8 C.
In the embodiments illustrated in Figures 10 and 11, the safety equipment is a helmet 21, 21.1. Fig. 10 is an image 4 taken by an RGB camera at the entrance to a working space. Fig. 11 reproduces the image 4 of Fig. 10 as a drawing with reference numbers, for better illustration. The tracking system of the invention is configured to identify moving human operators 13.1 and 13.2, and to identify the respective area of interest 14.1 and 14.2 associated with each human operator, each operator being a "moving entity 13" in accordance with the invention. The tracking system is configured to analyze the areas of interest 14.1, 14.2 (here: the head of the operator) and to determine the presence or absence of safety equipment 21 (here: a hardhat) in the areas of interest 14.1, 14.2. In the case of operator 13.1, no hardhat is detected in the reference zone 14.1, and a warning or safety measure is produced, for example a visible or audible warning as exemplified elsewhere in this specification. In the case of operator 13.2, a hardhat 21 is detected within the respective reference zone 14.2, and no warning is produced with respect to operator 13.2. It is noted that the analyzing unit may also detect the presence of hardhat 21.1, which is carried in the hand by operator 13.1. However, since the hardhat 21.1 is not present inside the reference zone, a safety measure is still produced. In the embodiment shown, hardhat 21 carries a tag 19 for facilitated recognition, whereas hardhat 21.1 does not carry such a tag. The invention can be performed with or without the use of such a tag. The algorithm of image analysis has to be adjusted with respect to whether or not a tag is used. The use of a tag may simplify the algorithms, because the same recognition pattern (the tag) can be used in different situations.
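By way of illustration only, the reference-zone check described above may be sketched as follows (illustrative Python; the rectangle representation and function names are assumptions made for this sketch, not part of the system as such): a hardhat only satisfies the safety rule if its detection box lies inside the operator's reference zone, so a hardhat carried in the hand does not prevent the safety measure.

```python
def zone_contains(zone, box):
    """Return True if the detected equipment box lies inside the reference zone.

    zone and box are (x_min, y_min, x_max, y_max) rectangles in image pixels.
    """
    zx0, zy0, zx1, zy1 = zone
    bx0, by0, bx1, by1 = box
    return zx0 <= bx0 and zy0 <= by0 and bx1 <= zx1 and by1 <= zy1


def check_operator(reference_zone, detected_hardhats):
    """Produce a warning unless at least one hardhat lies inside the zone.

    A hardhat detected elsewhere in the image (e.g. carried in the hand,
    as hardhat 21.1 in Fig. 11) does not satisfy the safety rule.
    """
    compliant = any(zone_contains(reference_zone, box) for box in detected_hardhats)
    return "ok" if compliant else "warning"
```

A hardhat box inside the zone yields "ok"; an empty list or a box outside the zone yields "warning".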
In an embodiment, the safety equipment 21 comprises a tag 19, which allows the analyzing unit of the tracking system to identify the presence of the safety equipment rapidly. The tag 19 may facilitate the detection of the safety equipment in the reference zone 14. In other embodiments, the analyzing unit is configured to identify a particular safety equipment 21 (helmet, safety glasses, etc.), even without the need of a tag. Whether or not a tag is used may depend on factors such as the area covered by a single camera, the quality of the camera and the quality of the algorithms used by the tracking system, and in particular by the identification unit 8 of the system (Fig. 2). For example, in the case of Fig. 10, a visible light sensitive camera is used for monitoring a relatively restricted area covering the access to a working space. Thanks to the relatively small area 12 (compare Fig. ) to be covered, the use of a tag on the hardhats 21 is optional, as the system may well recognize a helmet even without the use of a tag. It is further noted that some safety equipment, such as glasses, may provide little space for applying a tag, which is why in the embodiments shown in Figs 8 A through 9 C no tag was used for detecting the presence or absence of safety glasses.
In some embodiments, the system of the invention observes the occurrence of a deviation from safety rules. For example, the tracking system uses the one or more parameters for detecting the breach of a safety rule. Such safety rules may be the requirement for human operators to wear safety equipment, or rules with respect to the handling and/or operation of vehicles and machines. For example, safety rules may comprise speed limitations, or required distances between vehicles or between a vehicle and a stationary object or an operator. Furthermore, the system of the invention may detect the presence of an increased risk of an accident from the direction of movement (trajectory or extrapolated trajectory) of a moving entity, the speed of the entity, and from the presence and/or behavior of other entities, such as human operators or stationary objects. Safety measures are preferably taken as soon as a breach of a safety rule and/or an increased risk of accident is detected.
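The detection of an increased risk of accident from an extrapolated trajectory may, for example, be based on the closest approach between the extrapolated path of the moving entity and a stationary object. The following sketch (illustrative Python, assuming constant velocity over a short time horizon; all names and the example threshold values are assumptions of this sketch) shows one possible formulation:

```python
import math


def closest_approach(pos, vel, obstacle, horizon):
    """Minimum distance between an entity moving with constant velocity
    and a stationary obstacle within the next `horizon` seconds.

    pos, vel, obstacle are 2-D tuples in ground coordinates (m, m/s, m).
    Returns (min_distance, time_of_min_distance).
    """
    rx, ry = obstacle[0] - pos[0], obstacle[1] - pos[1]
    v2 = vel[0] ** 2 + vel[1] ** 2
    # Time at which the extrapolated trajectory is closest to the obstacle,
    # clamped to the interval [0, horizon].
    t = 0.0 if v2 == 0 else max(0.0, min(horizon, (rx * vel[0] + ry * vel[1]) / v2))
    dx, dy = rx - vel[0] * t, ry - vel[1] * t
    return math.hypot(dx, dy), t


def increased_risk(pos, vel, obstacle, horizon=5.0, safety_distance=2.0):
    """Flag an increased risk of accident if the extrapolated trajectory
    passes closer to the obstacle than the required safety distance."""
    d, _ = closest_approach(pos, vel, obstacle, horizon)
    return d < safety_distance
```

In a complete system, the position and velocity would be the parameters determined from the images, and the threshold would be taken from the set of safety rules.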
While the prior art reports the positioning of moving entities using localization devices carried by the entity, the tracking system of the present invention is operational on the basis of image-related data only, although the combination with sensors and the like may result in more exact and/or more reliable parameters and is also envisaged in some embodiments of the invention.
In a preferred embodiment, at least one camera that is used for the purpose of the present invention is pre-calibrated and/or installed before any moving entity is tracked. For example, once the camera is installed to observe a particular area of interest 12, deflection due to the lenses of the camera is determined and compensated by the data processing unit. Preferably, an algorithm is applied, which produces an image in which deflection has been corrected. In a second step, a grid pattern-based positioning system is applied to the image created by the camera, so that any given pixel/position in the image can be attributed to a position on the ground as shown in the image. This may be performed using LED signals positioned at known distances and angles, for example in the corners of a square provided on the ground in the observed area (Figure 15). In accordance with the rules of projective geometry, the required determination can be accomplished. The ground visible in image 4 provided by the camera can now be represented as a coordinate system 17. Each pixel or position in an image captured by the camera can now be assigned to a position, which can be expressed in terms of values of the two axes of the coordinate system 17. In other words, any pixel of the image that falls in the coordinate system corresponds to a position. Image pixels that are in a region where there is no ground surface do not represent any position. This is the case, for example, in the top part of the image of Figure 14, where a wall is seen.
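The pixel-to-ground association described above corresponds to estimating a plane-to-plane homography from the four marker positions. A minimal sketch (illustrative Python, using a direct linear transform with the last matrix entry fixed to 1; the function names are assumptions of this sketch) could look as follows:

```python
def gauss_solve(A, b):
    """Solve the linear system A.x = b by Gaussian elimination with
    partial pivoting (pure Python, no external libraries)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x


def solve_homography(pixels, ground):
    """Estimate the 3x3 homography H mapping image pixels to ground
    coordinates from four point correspondences, e.g. the four LED markers
    at the corners of the calibration square of Figure 15."""
    # Build the standard 8x8 DLT system with h33 fixed to 1.
    A, b = [], []
    for (u, v), (x, y) in zip(pixels, ground):
        A.append([u, v, 1, 0, 0, 0, -u * x, -v * x]); b.append(x)
        A.append([0, 0, 0, u, v, 1, -u * y, -v * y]); b.append(y)
    h = gauss_solve(A, b)
    return [h[0:3], h[3:6], h[6:8] + [1.0]]


def pixel_to_ground(H, u, v):
    """Map an image pixel to a position in the ground coordinate system 17."""
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return ((H[0][0] * u + H[0][1] * v + H[0][2]) / w,
            (H[1][0] * u + H[1][1] * v + H[1][2]) / w)
```

Once the homography has been estimated, any pixel lying on the ground plane can be converted into a position in the coordinate system 17 of the observed area.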
In an embodiment, said data processing unit 5 is adapted to associate a position on an image made by said one or more camera with a position on the ground of said geographical area 12. In an embodiment, said data processing unit 5 is adapted to determine a position on the ground of said geographical area 12 from a position of said entity in said image (4), and/or from the position of said reference zone 14.
In an embodiment, the geographical area of interest 12 is a substantially flat and/or even surface. If the area 12 comprises uneven parts, the data processing unit 5 may ignore them and consider the area to be even.
In an embodiment, said data processing unit 5 of the tracking system of the invention is adapted to calibrate an image 4 of said one or more camera by generating a coordinate system 17, wherein a position A in said coordinate system 17 represents a position on said geographical area of interest 12, wherein said data processing unit 5 is configured to associate said coordinate system 17 with said images 4 generated by said camera, and/or wherein any pixel of said image taken by said camera can be attributed to a position of said geographical area 12 on the basis of said coordinate system 17. In an embodiment, said image analysis unit 9 is adapted to determine the position where an entity 13 is in contact with the ground in said geographical area 12.
Figure 15 shows a device 33 that can be used for rapid geometric correction and/or calibration of a camera used in the system of an embodiment of the invention. The device comprises flexible bands or cords 35 of a determined length. Light sources 34 are provided in the four corners of the determined geometric shape of the spread and/or unfolded device, here a rectangle. In the device shown, there is also a light source in the center of the device, where the diagonals of the rectangle cross. Light sources 34 or assemblies of light sources of one or more wavelengths may be used, so that the camera to be calibrated detects the light source. In the device 33 shown, each light source is an assembly comprising two light sources, one emitting visible light (small empty circle in light source 34 in Fig. 15), the other infrared light (small filled-in circle in light source 34 in Fig. 15). In this manner, the device 33 is preferably useful for calibrating cameras that are sensitive to different light wavelengths. For calibrating the camera, the device 33 is spread at different positions in the geographical area of interest 12, and the data processing unit 5 geometrically calibrates the camera using the signals produced by the light sources 34. The data processing unit uses the known distances between the light sources to calibrate the camera. Figure 14 is an image 4 taken from a geographical area of interest 12, here the scene of a show-jumping course. The camera has been calibrated so that a coordinate system or grid 17 can be overlaid on the image. The camera is preferably immobilized, positioned in a fixed manner, or the specific position and adjustment of the camera is determined, so that if an image is taken with this particular view, any pixel can be associated with a position on the ground.
If the camera is motorized to be directed to another area of interest, the position for which it has been calibrated is preferably stored, so that the same position can be adjusted automatically at a later point in time, for example.
As becomes clear from the aforesaid, the data processing unit 5 of the tracking system 1 of an embodiment of the invention has the information required for determining the position of a detected entity in the geographical area of interest 12. First, the presence of a moving object of interest 13 is detected by the detection unit 6. Once tracking of the moving entity has started by unit 7, the analyzing unit 9 can determine the position of the tracked object. If the camera is positioned so as to yield a perspective view, the lowest point of a detected entity 13 on an image 4 may be used as the position of the entity in the area 12. In a perspective or front elevation view, the lowest point generally is the point of contact of the entity 13, for example a moving individual, with the ground, which is why the position of this point can be identified using the coordinate system 17, for example. On the other hand, if a camera is positioned on top, so as to provide a top-down view of the geographical area of interest, the image analysis unit 9 preferably determines a center of the detected entity and uses the position of this center in the coordinate system to determine the position of the entity in the geographical area.
The data processing unit 5, and in particular the image analysis unit 9, can determine several parameters from the position of the entity at a given point in time. In particular, the trajectory and the speed may be determined. From the trajectory, it is possible to determine the direction of movement and possibly the orientation of the moving entity.
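From two successive ground positions and their timestamps, speed and direction of movement follow directly. An illustrative sketch (Python; the angle convention of 0 degrees along the +x axis of the coordinate system, counter-clockwise, is an assumption of this sketch):

```python
import math


def speed_and_heading(p0, t0, p1, t1):
    """Speed (m/s) and direction of movement (degrees, 0 = +x axis,
    counter-clockwise) between two successive ground positions.

    p0, p1 are (x, y) positions in ground coordinates; t0, t1 timestamps
    in seconds, as determined from successive calibrated images.
    """
    dt = t1 - t0
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    speed = math.hypot(dx, dy) / dt
    heading = math.degrees(math.atan2(dy, dx)) % 360.0
    return speed, heading
```

Applying the function over a sequence of positions yields the speed profile along the trajectory 16.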
In an embodiment, the invention encompasses skeletal tracing of a living mammalian individual moving in the area of interest 12. The system of the invention may use, for example, a Kinect hardware component and/or commercially available software development kits (SDKs) for skeletal tracking. Skeletal tracing may be used, for example, to determine the orientation of a living individual, and/or for determining the number of steps/strides performed by an individual, for example within a given distance or course.
In an embodiment, the invention encompasses determining the respiratory rate of a living individual in the geographic area of interest. Respiratory rate may be determined using one or more infrared and/or thermographic cameras, for example, in particular if the ambient temperature is sufficiently different from the body temperature of an individual.
In an embodiment, the system 1 of the invention comprises at least one infrared camera 3, for example in addition to a visible light camera 2 (Figure 1). In order to determine a zone of interest within an image taken by the infrared camera, a processing unit similar to the one disclosed in Figure 2 may be used. In particular, a tracking unit and/or identification unit may identify the head 31 of a moving individual and track the head and/or the area in front of the head, in the direction of the orientation of the head (Figures 12 A and 12 B). In Figures 12 A and 12 B, the zone of interest 14 that the image analysis unit preferably analyzes is in the center of the left half of the image. By determining the differences in the images over time, the cycles of exhalation (Figure 12 A) and inhalation (Figure 12 B) can be derived using a suitable algorithm, and the respiratory rate can be determined. It is noted that the images in Figures 12 A and 12 B were treated so as to show in black the pixels that exceed a threshold brightness value. In this manner, the identification of exhalation is facilitated.
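One possible way of deriving the respiratory rate from the brightness of the thermal reference zone over time is to count exhalation peaks. The following sketch (illustrative Python; the simple peak criterion is an assumption of this sketch, and real signals would typically require smoothing first) shows the principle:

```python
def respiratory_rate(samples, fs):
    """Breaths per minute estimated by counting exhalation peaks in the
    mean brightness of the thermal reference zone over time.

    samples: one brightness value per frame; fs: frame rate in Hz.
    """
    mean = sum(samples) / len(samples)
    peaks = 0
    for i in range(1, len(samples) - 1):
        # A peak is a local maximum lying above the mean brightness.
        if samples[i] > mean and samples[i - 1] < samples[i] >= samples[i + 1]:
            peaks += 1
    duration_min = len(samples) / fs / 60.0
    return peaks / duration_min
```

With a clean periodic signal, the count of peaks per minute directly gives the respiratory rate.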
If a thermal camera 3 is used for determining respiratory rate, it may be provided at a fixed position, as described above, for example. In this case, the camera is preferably equipped with the required resolution and/or detection capacity, containing a large matrix of sensitive pixels, allowing the reliable analysis of a comparatively small area of the image taken by the camera. In another embodiment, a thermal camera 3 with a comparatively lower resolution and/or detection capacity may be used. For example, a motorized camera may be used, so that it can be directed to a particular zone of interest where there is a moving entity 13. For example, the data processing unit 5 may drive the motors of the infrared camera 3 to keep it directed towards the moving entity 13. The data processing unit 5 may use the position of the moving entity as determined with the aid of the visible light camera 2, as described elsewhere in this specification. Alternatively, the infrared camera 3 may be connected to a data processing unit containing its own detection, tracking and analyzing units, so that the infrared camera is directed towards the moving entity on the basis of images obtained from the infrared camera itself.
If the infrared camera is directed towards the moving entity 13 in real time, images can be produced continuously, at least as long as the moving object is followed. These images can be analyzed to determine the respiratory rate as described above. In an embodiment, the tracking system of the invention comprises at least one camera that is sensitive to infrared light and/or that captures infrared light. Preferably, the tracking system comprises a camera selected from infrared (IR) cameras, near infrared (NIR) cameras, and thermal cameras. In an embodiment of the invention, if the system 1 of the invention comprises two or more cameras, the two cameras may be directed to capture images from one or more selected from the group consisting of: (1) substantially the same geographical area of interest 12; (2) at least partially different geographical areas of interest 12.1, 12.2, etc; (3) a geographical area of interest 12 and a (smaller) area within the geographical area of interest; (4) a geographical area of interest and a view in the geographical area of interest, wherein the view may capture an image from an area that is partly or totally outside the area of interest, but which may capture an entity staying in the area of interest; and (5) an area of interest 12 and an area that is not of interest, wherein the latter may be used for calibration purposes and/or for generating reference and/or comparative data, for example reference data of one or more of the parameters described in this specification.
Figure 16 is an image taken by an infrared camera. Due to the high contrast between the living subject and the background obtained with this camera, the use of the infrared camera is particularly useful to rapidly identify a moving object within an area of interest. The tracking system of the invention may combine the use of an infrared camera with an RGB camera, for example, wherein the infrared camera is generally used to rapidly identify the moving entity, and the RGB camera is used in addition for image analysis, for example for retrieving information from the reference zone. Preferably, the infrared camera and the other camera used in combination with it are synchronized.
The invention encompasses displaying the respiratory rate on a display 11 (Figure 4), optionally together with other physiological data, as illustrated in Figure 6.
In an aspect, the invention provides the determination of a heart rate of a tracked individual. Figures 13 A and 13 B illustrate the determination of the heart rate of an individual using a hyperspectral and/or multispectral camera, for example. Figure 13 A is produced using a hyperspectral camera directed to a position on a body, which here is the common carotid artery. One image is produced as a line across the zone of interest, and successively produced image-lines are arranged one below the other, so as to yield Figure 13 A.
Figure 13 A shows the light intensities at specific wavelengths as determined by the hyperspectral camera. Fluctuations in intensity corresponding to the heart rate can be seen with some of the wavelengths. The white rectangle in Figure 13 A represents the zone of interest 14 that will be analyzed by the analyzing unit 9, for example. Using the data shown in Figure 13 B, the data processing unit 5 can determine the heart rate of the individual. In case of a moving object, care has to be taken to keep the camera directed to the zone of interest. As discussed above with respect to the determination of respiratory rate, it is envisaged to use a motorized camera, which is guided to remain oriented to the zone to be analyzed of the moving individual. The present invention envisages determining heart rate at a predetermined position, for example within the geographical area of interest 12. This solution is possible, for example, when it is known that an individual will pass a specific position. For example, in case of a show jumping event, the camera may be positioned to observe an area where the moving individual, for example the horse, is required and/or expected to slow down. In this case, heart rate may be determined at specific moments and/or when the individual passes specific, selected posts, and not necessarily over the entire course and/or trajectory. Accordingly, if camera 3 in Figures 1-3 is a hyperspectral camera, for example, it may not observe the same geographical area as the visible camera 2. More generally, the second camera 3 may be positioned or motorized so as to be directed towards a selected extract within the geographical zone of interest 12, for example.
In an embodiment, the tracking system of the invention comprises a plurality of cameras, 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, wherein one or more of said cameras are adapted to capture images from a geographical area of interest 12.1, or - from within the area 12.1 - from an extract or smaller area, and/or from different positions and/or angles.
The invention further encompasses determining transpiration of an individual using, for example, camera and image analysis tools. As with respect to other physiological parameters such as respiratory rate and heart rate, a camera may be positioned to cover the same geographical area 12 as the visible light camera 2, or another zone, or an extract within the area 12. The camera used for determining transpiration may be a visible or hyperspectral camera, for example. In another embodiment, the tracking system comprises an IR, NIR or thermal camera, wherein said analyzing unit 9 is adapted to determine, from an image of said individual obtained with said IR, NIR or thermal camera, a status of transpiration of said individual.
For example, transpiration of an individual may be determined from the extent of light reflection from the body of the individual. In this regard, it is envisaged to use a light source (not shown in figures), to facilitate detection of a parameter, such as a physiological parameter. For example, a light source is used that accentuates reflection in presence of transpiration. For example, a light source emitting polarized light may be used. The camera capturing the reflected light may be equipped with polarizing filters, so that light that does not originate from reflection of the polarized light can be removed for the purpose of analysis, for example. Alternatively, a light source favoring reflection may be used. It is possible to calibrate the system with respect to an individual, for example determining a reference value of "reflection" and/or transpiration, and then determine transpiration during a sports activity, such as a competition, for example. Similar to heart rate and respiratory rate, the invention encompasses tracking transpiration continuously, at specific time intervals and/or at specific positions, for example posts of a course where an individual is expected to pass. In an embodiment, the tracking system 1 of the invention comprises one or more light sources.
The invention encompasses the combination of image analysis technology, for example as described herein, with other tools of analysis, so as to render the determined parameters more reliable or more precise, for example, or to have redundant data that can be used for allowing control of the correct operation of the tracking system. Other analysis may include other non-invasive analysis, such as sonar measurements, or the use of sensors placed on or carried by the moving entity. In an embodiment, the tracking system is adapted to generate redundant data with respect to a parameter of said entity 13, for example using the microsystem as specified elsewhere in this specification.
In an embodiment, the invention provides for the measuring of sound and for synchronizing the sound measurements with the visual measurements and/or visually determined parameters. For example, the noise emission produced by the steps of a horse may be used to determine the number of steps during a time period or within a trajectory. The steps determined with sound measurements may be synchronized with the steps as determined from tracking with cameras and the data processing unit 5. In cases where the entity or one of its accessories (such as a ball) can carry a microsystem, a network based on wireless sensors is preferably used in combination with the contactless measurements. A "microsystem" is an assembly of electronic circuits assuming the function of a sensor and having access to a wireless transmission for allowing a real-time use of the transmitted information. The combination of microsystems with the present tracking system is part of the invention. In an embodiment, the microsystem comprises an electronic circuit comprising a sensor 26 and a transmission system and/or a transmitter 27, for transmitting data produced by said sensor. In an embodiment, the tracking system comprises one or more sensors 26 adapted to be carried on or by said entity 13, wherein said sensor is selected from any one or more of the group consisting of: heart rate sensors, galvanic skin resistance (GSR) sensors, inertial sensors, such as gyroscopes and/or accelerometers, position and orientation sensors, such as magnetometers and GPS sensors, sound sensors and microphones, distance meters, pressure transducers, transponders and/or transmitters allowing direction finding (DF) and/or triangulation, and temperature sensors.
In an embodiment, the tracking system of the invention comprises a receiver unit 28, adapted to receive data transmitted by a transmitter 27 and/or microsystem 25 placed on said entity.
In an embodiment, the tracking system of the invention is adapted to determine a given parameter of said entity 13 from image-related data of said one or more cameras 2, 3 and from data generated by a sensor 26 carried by said entity 13. In an embodiment, the tracking system is configured to determine simultaneously or nearly simultaneously, and independently, a given parameter from said image-related data and from said sensor-generated data, respectively. In this manner, the tracking system obtains the same parameter or comparable parameters from independent sources. These data may be considered redundant data, and may be used as a control. The system may comprise algorithms for transforming redundant data so as to render comparable the values related to a given parameter but determined in different ways. The system of the invention may comprise a safety rule which depends on the similarity of the redundant data related to a given parameter. For example, the breach of a safety rule may be found if the data obtained from different sources (for example: image-related data and sensor-related data) but related to a given parameter diverge beyond a particular threshold value. One could also envisage the use of microsystems for calibrating the camera and/or automatically generating the coordinate system, for example as described with respect to Figure 14, so as to allow image-based tracking of an object thereafter. In this case, a "manual" calibration by placing a device with light sources as shown in Figure 15 on different areas of the geographical area of interest could be omitted.
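The safety rule on redundant data described above may be illustrated as a simple relative-divergence test (illustrative Python; the function name and the example threshold are assumptions of this sketch):

```python
def redundancy_breach(image_value, sensor_value, max_relative_divergence=0.1):
    """Safety rule on redundant data: the same parameter determined
    independently from image-related data and from a carried sensor must
    not diverge beyond a threshold, otherwise a breach is flagged."""
    reference = max(abs(image_value), abs(sensor_value))
    if reference == 0:
        return False
    return abs(image_value - sensor_value) / reference > max_relative_divergence
```

For example, two heart rate values of 60 and 62 beats per minute would pass, whereas 60 and 90 would be flagged as a breach at a 10% threshold.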
The present invention is different from the art in several aspects as specified below (these differences do not apply to all embodiments encompassed by the invention):
In order to allow rapid detection of faults, a test system in which intermediate values of the algorithm can be consulted in real time may be used in combination with the tracking system, for example: (1) with a single camera for simple cases, (2) with several cameras having complementary fields of vision in order to increase the surface of the observed zone, (3) with a stereo camera or a plurality of cameras observing the same zone from different viewpoints in order to monitor the actors in a complex environment.
Two or more types of cameras may be combined for tracking using properties of the subject, such as heat release or reflection of UV light in addition to information available from the visible light spectrum.
In some embodiments, the invention encompasses the combination of tracking and measuring physical, physiological and biometric parameters or characteristics. In an embodiment, the invention encompasses combining tracking with sensors or tags provided directly on the respective entity 13 or on an object used by the entity, for example a golf club or polo mallet.
In an embodiment, the tracking system of the invention comprises a hyperspectral and/or multispectral camera. In accordance with this embodiment, the invention encompasses combining tracking with a hyperspectral or multispectral camera in order to retrieve physical, physiological and biometric parameters without exerting any impact or effect on the tracked subject 13, for example. In an embodiment, said one or more data processing unit 5, in particular said image analysis unit 9, is capable of and/or configured to detect, on a reference zone 14 on the skin of said individual, a change of light intensity. Preferably, said data processing unit 5, and in particular said image analysis unit 9, is adapted and/or configured to detect, from said light intensity, the presence of water and/or a transpiration of said individual.
In an embodiment, the invention encompasses combining the tracking with a light source suitable for eliciting biometric characteristics from the surface of the subject 13 (hair, skin, pelage (of an animal), paint). In an embodiment, the invention encompasses the exploitation of tags placed on the moving and/or tracked entity 13 for facilitating classification, ballistic measures and/or the identification of the entity 13.
In an embodiment, the invention encompasses the exploitation of tracking information for performing automated comparisons of trajectories and/or strategies, for example in order to understand and resolve logistic problems, or, during a sports event, tracking of a horse during a horse race.
In an embodiment, the invention encompasses measuring, in an automated manner, the position, orientation, direction of movement, speed and stride, for example of a horse during a horse race competition.
In an embodiment, the invention makes use of tracking information for determining the risk of an accident and offering statistics on near-accidents of an actor, with or without access to the corresponding images.
In an embodiment of the invention, the tracking system is combined with sensors associated with a tool for loading software or firmware content into one or another component of the system, allowing in this manner optimal functioning even in case of initially unforeseen events.
Reference numerals in the figures:
1 tracking system
2 camera (visible light) (2.1, 2.2)
3 camera (infrared) (3.1, 3.2, 3.3)
4 image taken by camera (in case of different images to be shown in figures: 4.1, 4.2, 4.3)
5 data processing unit
6 detection unit
7 tracking unit
8 identification unit
9 analyzing unit
10 output or display unit (screen, real time)
11 output or display unit (screen, trajectory)
12 geographical area of interest
12, 12.1, 12.2: adjacent areas of interest
13 (moving) entity, an object or a living individual
14 reference zone of moving object on image
15 memory unit of one or more data processing unit
16 trajectory of moving object or individual, e.g. defined with respect to the ground
17 coordinate system on image
18 accessory object
19 tag placed on individual/object
20 light source
21 safety equipment, such as safety glasses, hardhat or safety helmet
22 safety equipment (here: helmet) outside the reference zone.
23 safety rules / criteria
24 usual glasses, normal gear, different from safety equipment.
25 microsystem carried on object/individual
26 sensor carried on object/individual in microsystem 25
27 transmitter in microsystem 25
28 receiver unit
29 reference object
30 image analysis area
31 head of individual
32 ground contact, for example with leg, foot, here: hooves of the horse.
33 device for camera calibration
34 light sources for camera calibration.
35 flexible bands or cords.

Claims
1. A tracking system (1) comprising one or more cameras (2, 3) adapted to take images (4), one or more data processing units (5), and one or more output or display units (10, 11), wherein said camera (2, 3) is adapted to observe a geographical area of interest (12), wherein said one or more data processing unit (5) is configured to detect an entity (13) in the area of interest (12), and to determine at least one physical, physiological and/or biometric parameter associated with said entity (13).
2. The tracking system of claim 1, wherein said entity (13) is moving in said geographical area of interest (12).
3. The tracking system of any one of the preceding claims, wherein said entity (13) is an object, for example a vehicle, and/or a living individual, for example a human or an animal.
4. The tracking system of any one of claims 1-3, wherein said one or more data processing unit (5) is further configured to identify a reference zone (14) in at least some of said images (4), wherein said reference zone (14) is on or associated with said entity (13), and to analyze the reference zone (14) in order to determine said parameter associated with said entity (13).
5. The tracking system (1) of any one of the preceding claims, wherein said one or more data processing units comprise a detection unit (6), adapted to detect said entity (13) within said image (4), a tracking unit (7) adapted to track the detected entity (13) on the successive images taken by the camera (2, 3), and an analyzing unit (9), adapted to determine a parameter related to said object or individual (13) tracked by tracking system.
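The split of the data processing unit into a detection unit (6), a tracking unit (7) and an analyzing unit (9) described in claim 5 can be sketched as follows. This is an illustrative Python toy, not the claimed implementation: the class names are hypothetical, a frame is reduced to a list of foreground pixel coordinates, and a real system would use a computer-vision library for the detection step.

```python
# Sketch of the detection -> tracking -> analysis chain of claim 5.
# Class and method names are hypothetical; frames are modeled as lists of
# (x, y) pixel coordinates of foreground points for simplicity.

class DetectionUnit:
    def detect(self, frame):
        # Return the centroid of foreground points as the detected entity, or None.
        if not frame:
            return None
        xs = [p[0] for p in frame]
        ys = [p[1] for p in frame]
        return (sum(xs) / len(xs), sum(ys) / len(ys))

class TrackingUnit:
    def __init__(self):
        self.positions = []
    def update(self, detection):
        # Accumulate the detected positions over successive images.
        if detection is not None:
            self.positions.append(detection)
        return list(self.positions)

class AnalyzingUnit:
    def speed(self, positions, dt=1.0):
        # Displacement between the last two tracked positions per frame interval.
        if len(positions) < 2:
            return 0.0
        (x0, y0), (x1, y1) = positions[-2], positions[-1]
        return ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt

detector, tracker, analyzer = DetectionUnit(), TrackingUnit(), AnalyzingUnit()
frames = [[(0, 0), (2, 0)], [(3, 4), (5, 4)]]  # two synthetic frames
for frame in frames:
    track = tracker.update(detector.detect(frame))
print(analyzer.speed(track))  # speed of the centroid between the two frames
```

In a deployed system the derived parameter (here a speed) would then be compared against the safety rules of claims 6 and 7.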
6. The tracking system of any one of the preceding claims, wherein said tracking system (1) comprises a set of safety rules.
7. The tracking system of claim 6, wherein said tracking system is configured to detect or determine, by comparing said parameter with said safety rules, the occurrence of an undesired situation selected from one or more of the group of: (1) a deviation from safety rules, (2) an increased risk of accident, and (3) potentially harmful events.
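A minimal sketch of comparing a determined parameter against the set of safety rules (claims 6 to 8) is shown below; the rule names and threshold values are illustrative placeholders, not values from the application.

```python
# Illustrative safety-rule comparison. Thresholds are assumptions for the
# sketch, e.g. a speed limit in a walking area and a machine clearance distance.

SAFETY_RULES = {
    "max_speed_m_s": 5.0,
    "min_distance_m": 2.0,
}

def undesired_situations(parameters, rules=SAFETY_RULES):
    """Return the list of rule deviations detected for one entity."""
    events = []
    if parameters.get("speed_m_s", 0.0) > rules["max_speed_m_s"]:
        events.append("speed limit exceeded")
    if parameters.get("distance_m", float("inf")) < rules["min_distance_m"]:
        events.append("clearance distance violated")
    return events

# A non-empty result would trigger the safety measure of claim 8,
# e.g. an audible signal or a message to a telephone.
print(undesired_situations({"speed_m_s": 6.2, "distance_m": 1.5}))
```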
8. The tracking system of any one of the preceding claims, which is configured to produce a safety measure if an undesired situation is detected.
9. The tracking system of claim 8, wherein said safety measure is selected from the production of a visible or audible signal or message via said output unit (10, 11) or the sending of a message to a telephone or a computer.
10. The tracking system of any one of the preceding claims, wherein said parameter is the presence or absence of a safety equipment object (21) in said reference zone (14).
11. The tracking system of claim 10, wherein said safety equipment is selected from the group consisting of: safety glasses, a hardhat, a safety helmet, gloves, shoes, a life vest, and high visibility clothing, and combinations comprising two or more of the aforementioned.
12. The tracking system of any one of the preceding claims, wherein said physiological parameter is a health parameter, and wherein said system comprises safety rules for determining the presence of a risk for the health of the entity (13).
13. The tracking system of claim 12, wherein said physiological parameter or health parameter is selected from the heart rate, respiration, respiratory rate, transpiration, and body temperature of said entity.
14. The tracking system (1) of any one of the preceding claims, which is configured to identify an entity (13), wherein said entity is identified by one or more selected from the group of: ballistic analysis (speed, movement pattern), form recognition, identification of a visual or magnetic tag (19), color analysis, and analysis of the surface structure.
15. The tracking system (1) of any one of the preceding claims, which comprises at least one camera selected from the group of: visible light cameras; cameras that are sensitive to infrared light and/or that capture infrared light, such as cameras selected from infrared (IR) cameras, near infrared (NIR) cameras and thermal cameras; time-of-flight cameras; short bandwidth cameras; and UV cameras.
16. The tracking system (1) of any one of the preceding claims, which is configured to detect a breach of a safety rule from one or more parameters selected from the group of: the position, the orientation, the direction and the speed of the entity (13); the distance between two different entities (13); the distance between a moving entity and a stationary entity or object (18); the direction of movement between a moving entity (13) and a stationary object or entity (18); and combinations of the aforementioned.
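One combination of the parameters listed in claim 16, the distance between two entities together with their direction of movement, could be evaluated as follows; the threshold and the 2-D vector representation are assumptions made for illustration.

```python
# Illustrative breach check for claim 16: flag two entities whose mutual
# distance is below a threshold while they are moving towards each other.
# p1/p2 are positions, v1/v2 velocities, all as (x, y) tuples in metres.

def closing_risk(p1, v1, p2, v2, min_distance=2.0):
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    distance = (dx * dx + dy * dy) ** 0.5
    # Component of the relative velocity along the line between the entities;
    # a negative dot product means the gap is shrinking.
    rvx, rvy = v2[0] - v1[0], v2[1] - v1[1]
    closing = (rvx * dx + rvy * dy) < 0
    return distance < min_distance and closing

print(closing_risk((0, 0), (1, 0), (1.5, 0), (-1, 0)))  # close and converging
```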
17. The tracking system (1) of any one of the preceding claims, wherein said one or more data processing units (5), and in particular said analyzing unit (9), is adapted to compute one or more parameters in real time.
18. The tracking system (1) of any one of the preceding claims, wherein said one or more data processing units (5) further comprise a database or memory unit (15), wherein said data processing unit (5) is adapted to store data produced by the analyzing unit (9) on said memory unit (15).
19. The tracking system (1) of any one of the preceding claims, wherein said data processing unit (5) is adapted to associate a position on an image made by said one or more cameras with a position on the ground of said geographical area (12).
20. The tracking system (1) of any one of the preceding claims, wherein said data processing unit (5) is adapted to determine a trajectory (16) of an entity (13) by determining and/or storing successive positions of said entity on the geographical area of interest (12).
21. The tracking system (1) of any one of the preceding claims comprising an infrared camera (3), wherein said data processing unit (5), and in particular said image analysis unit (9), is capable of detecting, on an image (4) taken by said infrared camera (3), the event of exhalation by an individual (13) and/or of differentiating an event of exhalation from an event of inhalation or absence of exhalation, and/or wherein said data processing unit (5), in particular said analysis unit (9), is adapted to calculate a respiratory rate of said individual from the occurrence and/or recurrence of exhalation events over time.
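Once exhalation events have been detected on the infrared images, the respiratory rate of claim 21 can be computed from their recurrence over time. The detection step itself is outside this sketch; only the rate computation from hypothetical event timestamps is shown.

```python
# Respiratory rate from timestamps (in seconds) of detected exhalation events,
# e.g. warm exhaled-air plumes seen by the infrared camera (3).

def respiratory_rate_bpm(exhalation_times_s):
    """Breaths per minute from the mean interval between exhalation events."""
    if len(exhalation_times_s) < 2:
        return None  # at least two events are needed to form an interval
    intervals = [b - a for a, b in zip(exhalation_times_s, exhalation_times_s[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval

# Exhalations detected every 4 s correspond to 15 breaths per minute.
print(respiratory_rate_bpm([0.0, 4.0, 8.0, 12.0]))
```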
22. The tracking system (1) of any one of the preceding claims, wherein said data processing unit (5), and in particular said identification unit (8), is adapted to detect a tag (19) placed on an entity (13, 13') expected to move within the geographical area of interest (12), and/or wherein said tracking unit is adapted to trace and/or track said tag (19) within the image and/or said geographic area (12).
23. The tracking system (1) of the preceding claim, wherein said data processing unit (5) is adapted to determine, from said tag (19), one or more selected from the group consisting of: the identity of the entity (13), and the orientation of said entity (13) in said geographical area (12).
24. The tracking system (1) of any one of the preceding claims, further comprising a light source (20) capable of emitting light, and wherein said data processing unit (5) is adapted to determine, from the light reflected from the surface of said object and/or individual (13), a parameter with respect to said individual (13).
25. The tracking system (1) of any one of the preceding claims, comprising a plurality of cameras (2, 2.1, 2.2, 2.3), wherein said cameras are adapted to observe adjacent and/or overlapping geographical areas (12.1, 12.2, 12.3).
26. The tracking system of any one of the preceding claims, further comprising one or more sensors (26) adapted to be carried on or by said entity (13), wherein said sensor is selected from any one or more of the group consisting of: heart rate sensors, galvanic skin resistance (GSR) sensors, inertial sensors, such as gyroscopes and/or accelerometers, position and orientation sensors, such as magnetometers and GPS sensors, sound sensors and microphones, distance meters, pressure transducers, transponders and/or transmitters allowing direction finding (DF) and/or triangulation, and temperature sensors.
27. The tracking system of the preceding claim, which is configured to determine simultaneously or nearly simultaneously, and independently, a given parameter from said image-related data and from sensor-generated data, respectively.
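The independent determination of the same parameter from image data and from a worn sensor (26), as in claim 27, allows a plausibility cross-check between the two channels. The fusion rule and tolerance below are illustrative assumptions, using a heart rate as the example parameter.

```python
# Cross-check of two independent estimates of the same parameter (claim 27):
# one derived from camera images, one from a sensor carried by the entity.
# Averaging and the 15 % relative tolerance are assumptions for the sketch.

def cross_check(camera_value, sensor_value, rel_tolerance=0.15):
    """Return (fused_value, consistent_flag) for two independent estimates."""
    fused = (camera_value + sensor_value) / 2.0
    consistent = abs(camera_value - sensor_value) <= rel_tolerance * fused
    return fused, consistent

print(cross_check(72.0, 76.0))   # close estimates: consistent
print(cross_check(72.0, 120.0))  # divergent estimates: flagged for review
```

A disagreement between the two channels can itself be treated as an undesired situation, triggering a safety measure or a recalibration.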
28. Use of the tracking system of any one of the preceding claims, for reducing the occurrence of accidents in professional environments and/or for increasing safety in professional environments.
29. Use of the tracking system of any one of the preceding claims, for assuring the implementation of safety rules, such as the wearing of safety equipment during work.
30. Use of the tracking system (1) of any one of the preceding claims, for monitoring, tracing, tracking and/or displaying one or more selected from the group of: a professional activity, a security or safety training event and/or security and/or safety exercise, a sports event, and a military training event.
31. Use of the tracking system (1) of any one of the preceding claims, for assessing the risk of an accident involving, for example, one or more individuals and/or vehicles, and/or of bodily harm to an individual, for example a sportsperson, for example during training and/or competition.
32. A method for providing information and/or one or more parameters of an entity (13), for example a living individual, during one or more selected from the group of: professional activities, training and/or security and/or safety events or exercises, military training events and/or a sports event, the method comprising the steps of:
producing an image (4) and/or a sequence of images of said entity (13);
determining, from data of said image, said information and/or parameter.
EP14825281.0A 2013-12-14 2014-12-15 Camera-based tracking system for the determination of physical, physiological and/or biometric data and/or for risk assessment Withdrawn EP3080752A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP14825281.0A EP3080752A1 (en) 2013-12-14 2014-12-15 Camera-based tracking system for the determination of physical, physiological and/or biometric data and/or for risk assessment

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP13197305 2013-12-14
PCT/EP2014/077801 WO2015086855A1 (en) 2013-12-14 2014-12-15 Camera-based tracking system for the determination of physical, physiological and/or biometric data and/or for risk assessment
EP14825281.0A EP3080752A1 (en) 2013-12-14 2014-12-15 Camera-based tracking system for the determination of physical, physiological and/or biometric data and/or for risk assessment

Publications (1)

Publication Number Publication Date
EP3080752A1 true EP3080752A1 (en) 2016-10-19

Family

ID=49916826

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14825281.0A Withdrawn EP3080752A1 (en) 2013-12-14 2014-12-15 Camera-based tracking system for the determination of physical, physiological and/or biometric data and/or for risk assessment

Country Status (3)

Country Link
EP (1) EP3080752A1 (en)
CN (1) CN105917355B (en)
WO (1) WO2015086855A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017187666A1 (en) * 2016-04-26 2017-11-02 三菱電機株式会社 Worker management device
US9858388B1 (en) 2016-09-26 2018-01-02 International Business Machines Corporation Health monitoring using parallel cognitive processing
US9720086B1 (en) 2016-11-22 2017-08-01 4Sense, Inc. Thermal- and modulated-light-based passive tracking system
US9638800B1 (en) 2016-11-22 2017-05-02 4Sense, Inc. Passive tracking system
CN107126224B (en) * 2017-06-20 2018-02-06 中南大学 A kind of Monitoring and forecasting system in real-time method and system of the track train driver status based on Kinect
US10249163B1 (en) 2017-11-10 2019-04-02 Otis Elevator Company Model sensing and activity determination for safety and efficiency
CN108031089A (en) * 2017-11-28 2018-05-15 安徽省蓝翔体育用品有限公司 A kind of system for extending shuttlecock service life
CN111919236A (en) * 2018-02-23 2020-11-10 艾卢诺斯公司 Monitoring of physiological parameters
CN108720825B (en) * 2018-03-29 2020-11-06 合肥工业大学 Multi-camera-based seamless detection method for non-contact vital sign parameters
US11501619B2 (en) 2019-11-22 2022-11-15 Deere & Company Worksite classification system and method
FR3103442B1 (en) * 2019-11-27 2023-08-11 Thales Sa DEVICE AND METHOD FOR AUTONOMOUS MONITORING OF A LEVEL CROSSING
IL275524B (en) 2020-06-18 2021-12-01 Elbit Systems C4I And Cyber Ltd Contactless parameters measurement system and method

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
US8626472B2 (en) 2006-07-21 2014-01-07 James C. Solinsky System and method for measuring balance and track motion in mammals
CA2706695C (en) 2006-12-04 2019-04-30 Lynx System Developers, Inc. Autonomous systems and methods for still and moving picture production
DE102008002275A1 (en) * 2008-06-06 2009-12-10 Robert Bosch Gmbh Image processing device with calibration module, method for calibration and computer program
WO2012023639A1 (en) * 2010-08-17 2012-02-23 엘지전자 주식회사 Method for counting objects and apparatus using a plurality of sensors
US8903119B2 (en) * 2010-10-11 2014-12-02 Texas Instruments Incorporated Use of three-dimensional top-down views for business analytics
JP5822651B2 (en) * 2011-10-26 2015-11-24 株式会社ソニー・コンピュータエンタテインメント Individual discrimination device and individual discrimination method
KR20130085315A (en) 2012-01-19 2013-07-29 한국전자통신연구원 Method for video surveillance system based on human identification
JP5891061B2 (en) * 2012-02-15 2016-03-22 株式会社日立製作所 Video monitoring apparatus, monitoring system, and monitoring system construction method
US9363859B2 (en) * 2012-04-20 2016-06-07 Rensselaer Polytechnic Institute Sensory lighting system and method for characterizing an illumination space

Non-Patent Citations (2)

Title
None *
See also references of WO2015086855A1 *

Also Published As

Publication number Publication date
CN105917355A (en) 2016-08-31
CN105917355B (en) 2020-07-03
WO2015086855A1 (en) 2015-06-18


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20160713

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20180227

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20240423