JP7604439B2 - Robotic interactions for observable signs of core health - Google Patents
Robotic interactions for observable signs of core health
- Publication number
- JP7604439B2 (application JP2022185610A)
- Authority
- JP
- Japan
- Prior art keywords
- individuals
- user
- machine learning
- learning model
- robotic device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1116—Determining posture transitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique
- A61B5/1128—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique using image analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4803—Speech analysis specially adapted for diagnostic purposes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/746—Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/0005—Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
- B25J11/009—Nursing, e.g. carrying sick persons, pushing wheelchairs, distributing drugs
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0003—Home robots, i.e. small robots for domestic use
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/254—Fusion techniques of classification results, e.g. of results related to same input data
- G06F18/256—Fusion techniques of classification results, e.g. of results related to same input data of results relating to different input data, e.g. multimodal recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/008—Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/809—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data
- G06V10/811—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data the classifiers operating on different input data, e.g. multi-modal recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/35—Categorising the entire scene, e.g. birthday party or wedding scene
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2505/00—Evaluating, monitoring or diagnosing in the context of a particular type of medical care
- A61B2505/07—Home care
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2560/00—Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
- A61B2560/02—Operational features
- A61B2560/0242—Operational features adapted to measure environmental factors, e.g. temperature, pollution
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2560/00—Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
- A61B2560/02—Operational features
- A61B2560/0242—Operational features adapted to measure environmental factors, e.g. temperature, pollution
- A61B2560/0247—Operational features adapted to measure environmental factors, e.g. temperature, pollution for compensation or correction of the measured physiological value
- A61B2560/0252—Operational features adapted to measure environmental factors, e.g. temperature, pollution for compensation or correction of the measured physiological value using ambient temperature
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2560/00—Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
- A61B2560/02—Operational features
- A61B2560/0242—Operational features adapted to measure environmental factors, e.g. temperature, pollution
- A61B2560/0247—Operational features adapted to measure environmental factors, e.g. temperature, pollution for compensation or correction of the measured physiological value
- A61B2560/0257—Operational features adapted to measure environmental factors, e.g. temperature, pollution for compensation or correction of the measured physiological value using atmospheric pressure
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2560/00—Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
- A61B2560/02—Operational features
- A61B2560/0242—Operational features adapted to measure environmental factors, e.g. temperature, pollution
- A61B2560/0247—Operational features adapted to measure environmental factors, e.g. temperature, pollution for compensation or correction of the measured physiological value
- A61B2560/0261—Operational features adapted to measure environmental factors, e.g. temperature, pollution for compensation or correction of the measured physiological value using hydrostatic pressure
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0204—Acoustic sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0247—Pressure sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0271—Thermal or temperature sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7465—Arrangements for interactive communication between patient and care services, e.g. by using a telephone network
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G10L2015/223—Execution procedure of a spoken command
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G10L2015/226—Procedures used during a speech recognition process, e.g. man-machine dialogue using non-speech characteristics
- G10L2015/228—Procedures used during a speech recognition process, e.g. man-machine dialogue using non-speech characteristics of application context
Description
This application claims the benefit of U.S. Provisional Patent Application No. 62/675,729, filed May 23, 2018, entitled "System and Method for Robotic Interactions for Observable Signs of Core Health," and U.S. Provisional Patent Application No. 62/675,730, filed May 23, 2018, entitled "System and Method for Robotic Interactions for Observable Signs of Intent," both of which are incorporated herein by reference in their entirety.
The present disclosure relates generally to robots, and more particularly to assistive robots.
Described herein is an assistive robot that observes signs of health conditions, health hazards, and/or conditions of health concern in a home, workplace, healthcare facility, or other setting. The assistive robot may take actions to avoid dangerous situations, diagnose health problems, respond to requests for assistance, and periodically process or analyze a person's medical condition. Also described herein is an assistive robot that anticipates the needs of one or more people (or animals). The assistive robot may be aware of current activities, of knowledge about a person's routines, and of contextual information, and may provide, or offer to provide, appropriate robotic assistance.
FIG. 1 is a diagram of a system environment for managing an assistive robot, according to one embodiment. The system environment includes an assistance platform 120, a client device 110, an assistive robot 102, and a device 106, all connected via a network 140. In other embodiments, different and/or additional entities may be included in the system architecture. The environment may be a residential, healthcare, or workplace environment.
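The detection-and-response pipeline described above (sense the environment, recognize actions, learn per-individual behavioral features, flag deviations, decide a response) can be sketched in simplified form. All class, function, and action names below are illustrative assumptions, not taken from the patent; a production system would use the trained machine learning models the disclosure describes rather than this toy set-membership model.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class Observation:
    individual: str
    action: str
    hour: int  # hour of day at which the action was observed

class RoutineModel:
    """Toy behavioral model: learns which (action, hour) pairs are routine
    for each individual, then flags unfamiliar actions as potential risks."""
    def __init__(self):
        self.routine = defaultdict(set)  # individual -> {(action, hour), ...}

    def fit(self, history):
        # "Series of past actions" per individual becomes a set of routine pairs.
        for obs in history:
            self.routine[obs.individual].add((obs.action, obs.hour))

    def is_health_risk(self, obs):
        # An action never seen in this individual's routine is treated
        # as a deviation worth responding to.
        return (obs.action, obs.hour) not in self.routine[obs.individual]

def decide_response(model, obs):
    """Map the model's output to a robot response."""
    return "alert_caregiver" if model.is_health_risk(obs) else "no_action"

history = [Observation("alice", "take_medication", 8),
           Observation("alice", "eat_breakfast", 8),
           Observation("alice", "sleep", 22)]
model = RoutineModel()
model.fit(history)
print(decide_response(model, Observation("alice", "eat_breakfast", 8)))    # no_action
print(decide_response(model, Observation("alice", "wander_outside", 3)))   # alert_caregiver
```

The set-membership check stands in for the learned behavioral-feature comparison; the disclosure's actual embodiments mention decision trees, support vector machines, and neural networks for this step.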
Claims (18)
1. A robotic device comprising:
a plurality of sensors configured to:
capture images of a plurality of individuals in an environment, and
generate environmental data of the environment;
a processor; and
a memory storing instructions configured to cause the processor to perform:
recognizing a series of past actions performed by the plurality of individuals and the environmental data to generate the series of past actions of the plurality of individuals;
inputting the series of past actions of the plurality of individuals into a machine learning model to enable the machine learning model to learn one or more behavioral features, each specific to one individual or one set of individuals of the plurality of individuals;
identifying a new action performed by a particular individual of the plurality of individuals;
inputting the new action into the machine learning model;
comparing, using the machine learning model, the new action with the one or more behavioral features learned by the machine learning model;
determining, by the machine learning model, that the new action poses a health risk to the particular individual; and
determining, based on an output of the machine learning model, a response to be performed by the robotic device.
2. The robotic device of claim 1, wherein determining that the new action poses a health risk to the particular individual comprises providing the series of past actions performed by the plurality of individuals and the environmental data to a machine learning model, the machine learning model being trained to recognize the behavioral features.
3. The robotic device of claim 2, wherein the machine learning model is trained using training data comprising labels describing reference behavioral features.
4. The robotic device of claim 1, wherein determining that the new action poses a health risk to the particular individual comprises determining whether the series of past actions is included in one or more predetermined behavioral features, a predetermined behavioral feature corresponding to a predetermined health risk.
5. The robotic device of claim 1, wherein analyzing the series of past actions performed by the plurality of individuals and the environmental data comprises determining a temporal characteristic of each of the series of past actions, the health risk being determined based at least on the temporal characteristic.
6. The robotic device of claim 1, wherein the environmental data comprises at least one of an ambient temperature, a time of day, a location, an object, and a position of the object relative to the plurality of individuals.
7. The robotic device of claim 1, wherein analyzing the series of past actions performed by the plurality of individuals and the environmental data comprises determining a health risk based on behavioral habits in response to determining that the series of past actions is similar to a behavioral schedule corresponding to a context.
8. The robotic device of claim 7, further comprising determining the behavioral schedule from an action history of the plurality of individuals and the environmental data.
9. The robotic device of claim 1, wherein the series of past actions comprises at least one of following the individual, warning the individual, monitoring the individual, assisting the individual, and sending a warning to another user.
10. The robotic device of claim 1, wherein the new action comprises at least one of fetching an object, turning on a device, turning off a device, searching the Internet, and providing information.
11. The robotic device of claim 1, wherein the series of past actions comprises:
providing the one or more tasks to the plurality of individuals;
receiving a confirmation from the plurality of individuals; and
updating the machine learning model based on the confirmation from the plurality of individuals.
12. The robotic device of claim 10, further comprising at least a robotic arm and a locomotion system.
13. The robotic device of claim 10, wherein the plurality of sensors comprises at least one of a camera, a microphone, a position sensor, a depth sensor, a pressure sensor, a contact sensor, a barometer, a thermometer, a hygrometer, and a gas detector.
14. A method comprising:
capturing, by a robotic device, images of a plurality of individuals in an environment;
generating, by the robotic device, environmental data of the environment;
recognizing, by the robotic device, a series of past actions performed by the plurality of individuals and the environmental data to generate the series of past actions of the plurality of individuals;
inputting the series of past actions of the plurality of individuals into a machine learning model to enable the machine learning model to learn one or more behavioral features, each specific to one individual or one set of individuals of the plurality of individuals;
identifying a new action performed by a particular individual of the plurality of individuals;
determining, by the machine learning model, that the new action poses a health risk to the particular individual; and
determining, based on an output of the machine learning model, a response to be performed by the robotic device.
15. The method of claim 14, wherein determining that the new action poses a health risk to the particular individual comprises providing the series of past actions performed by the plurality of individuals and the environmental data to a machine learning model, the machine learning model being trained to recognize the behavioral features.
16. The method of claim 15, further comprising training the machine learning model using training data comprising labels describing reference behavioral features.
17. The method of claim 14, wherein analyzing the series of past actions performed by the plurality of individuals and the environmental data comprises determining the health risk based on behavioral habits in response to determining that the series of past actions is similar to a behavioral schedule corresponding to the context.
18. The method of claim 17, further comprising determining the behavioral schedule from the action history of the plurality of individuals and the environmental data.
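Claims 7, 8, 17, and 18 above describe deriving a behavioral schedule from an action history and judging a health risk by how closely observed behavior tracks that schedule. A minimal sketch under assumed representations (actions as (hour, name) pairs, similarity as Jaccard overlap); the function names, data shapes, and threshold are illustrative, not taken from the claims:

```python
def build_schedule(action_history):
    """Derive a behavioral schedule as the set of (hour, action) pairs seen in history."""
    return set(action_history)

def similarity(sequence, schedule):
    """Jaccard similarity between an observed action sequence and the schedule."""
    seq = set(sequence)
    if not seq and not schedule:
        return 1.0
    return len(seq & schedule) / len(seq | schedule)

def assess(sequence, schedule, threshold=0.5):
    # If the observed sequence is similar to the learned schedule, risk is judged
    # against behavioral habit; otherwise the deviation itself is flagged.
    if similarity(sequence, schedule) >= threshold:
        return "within_habit"
    return "deviation_flagged"

# Action history for one context (e.g., a typical day), then two assessments.
history = [(8, "wake"), (8, "medication"), (12, "lunch"), (22, "sleep")]
schedule = build_schedule(history)
print(assess([(8, "wake"), (8, "medication"), (12, "lunch")], schedule))  # within_habit
print(assess([(3, "wander"), (4, "door_open")], schedule))                # deviation_flagged
```

A real embodiment would fold in the environmental data recited in the claims (ambient temperature, time of day, object positions) as additional features rather than relying on set overlap alone.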
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862675730P | 2018-05-23 | 2018-05-23 | |
US201862675729P | 2018-05-23 | 2018-05-23 | |
US62/675,730 | 2018-05-23 | ||
US62/675,729 | 2018-05-23 | ||
JP2020565775A JP7299245B2 (ja) | 2018-05-23 | 2019-05-23 | Robotic interactions for observable signs of core health
PCT/US2019/033842 WO2019226948A1 (en) | 2018-05-23 | 2019-05-23 | Robotic interactions for observable signs of core health |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2020565775A Division JP7299245B2 (ja) | 2018-05-23 | 2019-05-23 | Robotic interactions for observable signs of core health
Publications (2)
Publication Number | Publication Date |
---|---|
JP2023026707A JP2023026707A (ja) | 2023-02-27 |
JP7604439B2 true JP7604439B2 (ja) | 2024-12-23 |
Family
ID=68613823
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2020565775A Active JP7299245B2 (ja) | 2018-05-23 | 2019-05-23 | Robotic interactions for observable signs of core health |
JP2022185610A Active JP7604439B2 (ja) | 2018-05-23 | 2022-11-21 | Robotic interactions for observable signs of core health |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2020565775A Active JP7299245B2 (ja) | 2018-05-23 | 2019-05-23 | Robotic interactions for observable signs of core health |
Country Status (3)
Country | Link |
---|---|
US (3) | US11701041B2 (ja) |
JP (2) | JP7299245B2 (ja) |
WO (1) | WO2019226948A1 (ja) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3844778A1 (en) * | 2018-10-09 | 2021-07-07 | Valotec | Digital companion for healthcare |
JP7375770B2 (ja) * | 2018-12-07 | 2023-11-08 | Sony Group Corporation | Information processing device, information processing method, and program |
US12094196B2 (en) * | 2019-12-03 | 2024-09-17 | Samsung Electronics Co., Ltd. | Robot and method for controlling thereof |
CN113021362A (zh) * | 2019-12-09 | 2021-06-25 | 詹丽燕 | Health-preserving AI recognition robot integrating health behavior intervention and follow-up |
US10896598B1 (en) * | 2020-01-03 | 2021-01-19 | International Business Machines Corporation | Ambient situational abnormality detection and response |
CN111365832A (zh) * | 2020-03-13 | 2020-07-03 | Beijing Yunji Technology Co., Ltd. | Robot and information processing method |
DE102020204083A1 (de) | 2020-03-30 | 2021-09-30 | BSH Hausgeräte GmbH | Computer program product for a robot for operating a household dishwasher, and system comprising a household dishwasher and a computer program product for a robot |
US11205314B2 (en) | 2020-05-13 | 2021-12-21 | Motorola Solutions, Inc. | Systems and methods for personalized intent prediction |
WO2021254427A1 (zh) * | 2020-06-17 | 2021-12-23 | 谈斯聪 | Integrated robot and platform for ultrasound image data acquisition, analysis, and recognition |
CN112001248B (zh) * | 2020-07-20 | 2024-03-01 | Beijing Baidu Netcom Science and Technology Co., Ltd. | Active interaction method and apparatus, electronic device, and readable storage medium |
US20220203545A1 (en) * | 2020-12-31 | 2022-06-30 | Sarcos Corp. | Smart Control System for a Robotic Device |
CA3157774A1 (en) * | 2021-05-05 | 2022-11-05 | Sanctuary Cognitive Systems Corporation | Robots, tele-operation systems, and methods of operating the same |
CN113855250A (zh) * | 2021-08-27 | 2021-12-31 | 谈斯聪 | Medical robot device, system, and method |
CN113843813A (zh) * | 2021-10-27 | 2021-12-28 | Beijing Xiaoqiao Robot Technology Development Co., Ltd. | Robot with medical inquiry, daily diagnosis, and record-keeping functions |
US20220111528A1 (en) * | 2021-12-22 | 2022-04-14 | Intel Corporation | Unintended human action detection in an amr environment |
US20240197420A1 (en) * | 2022-12-15 | 2024-06-20 | Geoffrey Lee Ruben | Healthcare assistive robot apparatus |
JP2024147218A (ja) * | 2023-04-03 | 2024-10-16 | Shimadzu Corporation | Support device |
KR102677867B1 (ko) * | 2023-05-08 | 2024-06-26 | RoboCare Co., Ltd. | Scheduling method and device |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003225228A (ja) | 2002-01-31 | 2003-08-12 | Sanyo Electric Co Ltd | Health management terminal device, computer program, and recording medium |
JP2004337556A (ja) | 2003-05-13 | 2004-12-02 | Yasuo Fujii | Robot equipped with biological information acquisition means and health management functions |
JP2010140119A (ja) | 2008-12-09 | 2010-06-24 | Tokyo Metropolitan Univ | User health maintenance and activation support and watch-over system |
US20160193732A1 (en) | 2013-03-15 | 2016-07-07 | JIBO, Inc. | Engaging in human-based social interaction with members of a group using a persistent companion device |
JP2016154696A (ja) | 2015-02-24 | 2016-09-01 | Paramount Bed Co., Ltd. | Patient watch-over system |
JP2017010518A (ja) | 2015-06-24 | 2017-01-12 | Baidu Online Network Technology (Beijing) Co., Ltd. | Control system, method, and device for an intelligent robot based on artificial intelligence |
JP2017098180A (ja) | 2015-11-27 | 2017-06-01 | Raytron, Inc. | Lighting device |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002230555A (ja) * | 2001-02-01 | 2002-08-16 | Noa Syst:Kk | Detection device and method for detecting movement |
JP2005237668A (ja) | 2004-02-26 | 2005-09-08 | Kazuya Mera | Emotion-aware dialogue device in a computer network |
US9814425B2 (en) * | 2006-05-12 | 2017-11-14 | Koninklijke Philips N.V. | Health monitoring appliance |
US8909370B2 (en) * | 2007-05-08 | 2014-12-09 | Massachusetts Institute Of Technology | Interactive systems employing robotic companions |
US20110263946A1 (en) | 2010-04-22 | 2011-10-27 | Mit Media Lab | Method and system for real-time and offline analysis, inference, tagging of and responding to person(s) experiences |
US10540597B1 (en) | 2014-06-25 | 2020-01-21 | Bosch Sensortec Gmbh | Method and apparatus for recognition of sensor data patterns |
US10775314B2 (en) | 2017-11-10 | 2020-09-15 | General Electric Company | Systems and method for human-assisted robotic industrial inspection |
US20190184569A1 (en) | 2017-12-18 | 2019-06-20 | Bot3, Inc. | Robot based on artificial intelligence, and control method thereof |
US20230058605A1 (en) * | 2019-10-03 | 2023-02-23 | Rom Technologies, Inc. | Method and system for using sensor data to detect joint misalignment of a user using a treatment device to perform a treatment plan |
2019
- 2019-05-23 US US16/421,126 patent/US11701041B2/en active Active
- 2019-05-23 WO PCT/US2019/033842 patent/WO2019226948A1/en active Application Filing
- 2019-05-23 JP JP2020565775A patent/JP7299245B2/ja active Active
- 2019-05-23 US US16/421,120 patent/US11717203B2/en active Active

2022
- 2022-11-21 JP JP2022185610A patent/JP7604439B2/ja active Active

2023
- 2023-06-26 US US18/214,158 patent/US20240115174A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20190358822A1 (en) | 2019-11-28 |
US20190358820A1 (en) | 2019-11-28 |
JP7299245B2 (ja) | 2023-06-27 |
US11701041B2 (en) | 2023-07-18 |
JP2023026707A (ja) | 2023-02-27 |
US11717203B2 (en) | 2023-08-08 |
US20240115174A1 (en) | 2024-04-11 |
JP2021525421A (ja) | 2021-09-24 |
WO2019226948A1 (en) | 2019-11-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7604439B2 (ja) | Robotic interactions for observable signs of core health | |
Pham et al. | Delivering home healthcare through a cloud-based smart home environment (CoSHE) | |
Martinez-Martin et al. | Personal robot assistants for elderly care: an overview | |
JP6495486B2 (ja) | Autonomously acting robot and computer program | |
Rashidi et al. | A survey on ambient-assisted living tools for older adults | |
Lattanzio et al. | Advanced technology care innovation for older people in Italy: necessity and opportunity to promote health and wellbeing | |
JP6114470B2 (ja) | Healthcare decision support system, patient care system, and healthcare decision-making method | |
Pavel et al. | The role of technology and engineering models in transforming healthcare | |
WO2017147552A1 (en) | Multi-format, multi-domain and multi-algorithm metalearner system and method for monitoring human health, and deriving health status and trajectory | |
JP7234572B2 (ja) | Care system, management method therefor, and program | |
Arif et al. | A review on the technologies and services used in the self-management of health and independent living of elderly | |
CN112135568A (zh) | Information processing device, information processing method, and program | |
CN116945156A (zh) | Intelligent companion care system for the elderly based on computer vision technology | |
Tekemetieu et al. | Context modelling in ambient assisted living: Trends and lessons | |
US20240197420A1 (en) | Healthcare assistive robot apparatus | |
Tamilselvi et al. | Digital companion for elders in tracking health and intelligent recommendation support using deep learning | |
Zhang et al. | Evaluation of Smart Agitation Prediction and Management for Dementia Care and Novel Universal Village Oriented Solution for Integration, Resilience, Inclusiveness and Sustainability | |
Abeydeera et al. | Smart mirror with virtual twin | |
JP7435965B2 (ja) | Information processing device, information processing method, learning model generation method, and program | |
Borghese et al. | Heterogeneous Non Obtrusive Platform to Monitor, Assist and Provide Recommendations to Elders at Home: The MoveCare Platform | |
Newcombe | Investigation of Low-Cost Wearable Internet of Things Enabled Technology for Physical Activity Recognition in the Elderly | |
Boger et al. | Examples of ZETs | |
Sindhu et al. | IoT-Based Monitorization and Caliber Checker With Multiple Decision Making Using Faster R-CNN and Kalman Filter for Visually Impaired Elders: IoT-Based Old Age Health Monitoring | |
Abdelkawy | Hybrid approaches for context recognition in Ambient Assisted Living systems: application to emotion recognition and human activity recognition and anticipation | |
Ovalles-Pabón | State of the art on technological trends for the analysis of behavior and human activities |
Legal Events
Date | Code | Title | Description
---|---|---|---
2022-12-16 | A621 | Written request for application examination | JAPANESE INTERMEDIATE CODE: A621
2024-01-17 | A977 | Report on retrieval | JAPANESE INTERMEDIATE CODE: A971007
2024-01-30 | A131 | Notification of reasons for refusal | JAPANESE INTERMEDIATE CODE: A131
2024-04-25 | A601 | Written request for extension of time | JAPANESE INTERMEDIATE CODE: A601
2024-06-27 | A521 | Request for written amendment filed | JAPANESE INTERMEDIATE CODE: A523
2024-07-09 | A131 | Notification of reasons for refusal | JAPANESE INTERMEDIATE CODE: A131
2024-10-09 | A521 | Request for written amendment filed | JAPANESE INTERMEDIATE CODE: A523
| TRDD | Decision of grant or rejection written | |
2024-11-19 | A01 | Written decision to grant a patent or to grant a registration (utility model) | JAPANESE INTERMEDIATE CODE: A01
2024-12-11 | A61 | First payment of annual fees (during grant procedure) | JAPANESE INTERMEDIATE CODE: A61
| R150 | Certificate of patent or registration of utility model | Ref document number: 7604439; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150