
WO2022190187A1 - Vibration information providing system, vibration information providing server device, vibration information acquisition device, vibration information providing method, program for providing vibration information, and program for acquiring vibration information


Info

Publication number
WO2022190187A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
vibration
unit
vibration information
diagnostic
Application number
PCT/JP2021/009158
Other languages
English (en)
Japanese (ja)
Inventor
理絵子 鈴木
Original Assignee
株式会社ファセテラピー
Application filed by 株式会社ファセテラピー
Priority to JP2023504902A (patent JP7554515B2)
Priority to PCT/JP2021/009158
Publication of WO2022190187A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state

Definitions

  • The present invention relates to a vibration information providing system, a vibration information providing server device, a vibration information acquiring device, a vibration information providing method, a vibration information providing program, and a vibration information acquiring program, and particularly to technology for providing vibration information having a predetermined tactile quality.
  • A system is known in which target information such as voice information, image information, text information, heartbeat, body movement, and blood flow rate is analyzed to identify a tactile mode, and tactile content (vibration information that produces a predetermined tactile sensation) generated according to that tactile mode is supplied to a content utilization device (see, for example, Patent Document 1).
  • With this system, it becomes possible to appropriately select and provide tactile content that causes a specific tactile sensation through vibration, according to the tactile quality of the target information used by the user or of the target information related to the user's body.
  • Patent Document 1 describes supplying tactile content that matches the tactile quality analyzed from target information to a content-using device, and lists examples such as beauty instruments, in-vehicle equipment, home appliances, audio equipment, virtual experience equipment, medical equipment, welfare equipment, toys, game machines and their controllers, switches and buttons, vibration alarms, and touch panels with vibration feedback. However, it does not specifically describe how the haptic content is used in the medical field.
  • A device is also known in which a sensor detects information about the body tissue of a subject, a stimulation signal is generated based on the detection result, and vibration stimulation is applied to the tendons of muscles of a plurality of fingers of a worker holding a tool (see, for example, Patent Document 2).
  • The tactile force sense information presentation device described in Patent Document 2 can present information tactilely, without reducing workability or operability, to a worker who holds a special tool necessary for the work and whose vision is concentrated on that work.
  • For example, a contact state between the head portion of an endoscope and the subject is detected by a sensor, a stimulus signal having a magnitude corresponding to the degree of contact is generated, and vibration stimulation is applied to the tendons of the muscles of a plurality of fingers of the operator holding the endoscope.
  • There are also known therapeutic devices that generate vibrations according to symptoms (see, for example, Patent Documents 3 to 5).
  • In one such device, a bone conduction speaker is applied as the vibrator of a massager, and a signal whose frequency and output match the symptom is sent to the bone conduction speaker to vibrate it.
  • In another, a plurality of massage modes are displayed on a massage selection screen, and a vibrator is operated based on the massage mode selected and input by the user, so as to provide a massage effect according to the symptom of stiffness.
  • The massage device described in Patent Document 5 uses an artificial intelligence unit to make a diagnosis for the user, based on data encoding a doctor's knowledge and know-how and on the user's voice information input through dialogue with the user, and is configured to recommend a suitable massage course based on the diagnosis result.
  • Patent Document 5 also discloses that a diagnosis is made in consideration of physical conditions such as the user's pulse rate, body weight, body fat percentage, and stress index.
  • However, the tactile force sense information presentation device described in Patent Document 2 conveys information useful for the work, in the form of vibration stimulation, to a worker performing medical treatment; it does not apply vibration stimulation to the subject being treated.
  • In the therapy devices described in Patent Documents 3 to 5, vibrations corresponding to symptoms are given to the user, but there is no mention of selecting such vibrations from the viewpoint of tactile sensation.
  • An object of the present invention is to provide the user with vibration information related to vibration that produces an appropriate tactile sensation according to the user's physical or mental state.
  • To achieve this object, the present invention acquires diagnostic information about a user's physical or mental condition, selects, from among a plurality of pieces of vibration information relating to vibrations with different tactile qualities (characteristics of vibration that cause specific tactile sensations), the vibration information relating to vibration whose tactile quality corresponds to the diagnostic information, and provides it to the user.
  • According to the present invention thus configured, it is possible to provide the user with vibration information related to vibration that produces an appropriate tactile sensation according to the physical or mental state of the user.
  • For example, based on the diagnostic information about the physical or mental state caused by a symptom, it is possible to provide vibration information about vibration that produces a specific tactile sensation chosen from the viewpoint of alleviating or improving that symptom.
  • FIG. 2 is a block diagram showing an example functional configuration of a user terminal according to the first embodiment
  • FIG. 3 is a block diagram showing a functional configuration example of a server device according to the first embodiment
  • FIG. 4 is a diagram showing an example of the screen displayed on the display by the face image display unit
  • A diagram showing an example of classification of diagnostic information and vibration information
  • A diagram for explaining two tactile parameters
  • FIG. 11 is a diagram for explaining another example of section division
  • Diagrams showing further examples of classification of diagnostic information and vibration information
  • FIG. 11 is a block diagram showing an example functional configuration of a user terminal according to the second embodiment
  • FIG. 11 is a block diagram showing an example of the functional configuration of a server device according to the second embodiment
  • FIG. 12 is a block diagram showing an example of functional configuration of a user terminal according to the third embodiment
  • FIG. 11 is a block diagram showing an example of the functional configuration of a medical institution terminal according to the third embodiment
  • FIG. 11 is a block diagram showing an example of the functional configuration of a server device according to a third embodiment
  • FIG. 1 is a diagram showing an overall configuration example of a vibration information providing system according to a first embodiment.
  • As shown in FIG. 1, the vibration information providing system according to the first embodiment includes a user terminal 100 used by a user, a medical institution terminal 200 used in a medical institution, and a server device 300 that accepts access from the user terminal 100 and the medical institution terminal 200 and provides information.
  • The user terminal 100 and the medical institution terminal 200 are each connected to the server device 300 via a communication network 500 such as the Internet or a mobile phone network.
  • the user terminal 100 is configured by, for example, a personal computer, tablet, smartphone, or the like.
  • the medical institution terminal 200 is configured by, for example, a personal computer, a tablet, or a smart phone.
  • the user terminal 100 has a camera for capturing images (either the user terminal 100 itself is equipped with a camera or the user terminal 100 is configured to be connectable with a camera).
  • the user terminal 100 and the medical institution terminal 200 have displays for displaying images and the like.
  • In the first embodiment, the user terminal 100 functions as a vibration information acquisition device, and the server device 300 functions as a vibration information providing server device. That is, the camera of the user terminal 100 captures an image of the user's face and transmits the captured image to the server device 300.
  • the server device 300 analyzes the photographed image of the face transmitted from the user terminal 100 , selects vibration information according to the diagnosis result based on the face image, and provides the user terminal 100 with the vibration information.
  • The vibration information provided here is selected according to the diagnosis result regarding the user's physical or mental condition analyzed from the captured image of the face, and relates to vibration having a tactile quality that causes a specific tactile sensation.
  • the vibration information is waveform information (either analog information or digital information) that serves as a vibration source for generating vibration by a vibrator or the like. Using this vibration information, the user can apply vibrations generated by a vibrator to his or her face, thereby alleviating or improving physical or mental disorders and symptoms.
  • the server device 300 also provides the user terminal 100 with diagnostic information indicating the results of diagnosis performed by the server device 300 using the captured image of the face in association with the captured image of the face transmitted from the user terminal 100 .
  • the server device 300 provides the medical institution terminal 200 with the photographed image of the face transmitted from the user terminal 100 and diagnostic information generated by the server device 300 using the photographed image.
  • the photographed image of the face and the diagnostic information are shared between the user and the medical institution.
  • the user can use the shared information to have a dialogue with the medical institution when using the vibration information to improve the disorder or symptoms, and it is also possible to receive appropriate advice from the medical institution.
  • An application (hereinafter referred to as the vibration-related application) is installed in the user terminal 100, having a function of transmitting a photographed face image to the server device 300, a function of receiving vibration information and diagnostic information from the server device 300, and a function of displaying the photographed face image and diagnostic information on the display.
  • FIG. 2 is a block diagram showing a functional configuration example of the user terminal 100 according to the first embodiment.
  • The user terminal 100 includes, as a functional configuration, a facial image capturing unit 11, a facial image recording unit 12, a facial image transmitting unit 13, a vibration information receiving unit 14, a vibration information recording unit 15, a diagnostic information receiving unit 16, a diagnostic information recording unit 17, a vibration applying unit 18, and a face image display unit 19.
  • the user terminal 100 also includes a face image storage unit 10A and a vibration information storage unit 10B as storage media.
  • Each of the above functional blocks 11 to 19 can be configured by hardware, DSP (Digital Signal Processor), or software.
  • When configured by software, each of the functional blocks 11 to 19 is actually implemented by a computer equipped with a CPU, RAM, ROM, and the like, and is realized by the operation of the vibration-related application program stored in a storage medium such as the RAM, ROM, a hard disk, or semiconductor memory.
  • the functions of the facial image photographing unit 11, the facial image transmitting unit 13, the vibration information receiving unit 14, and the diagnostic information receiving unit 16 are realized by the vibration information acquisition program.
  • the face image capturing unit 11 uses the camera provided in the user terminal 100 to capture the user's face. Actually, the face image capturing unit 11 captures an image including the user's face and its surrounding background.
  • the face image recording unit 12 stores the photographed image of the face obtained by the photographing by the face image photographing unit 11 in the face image storage unit 10A together with information indicating the photographing date and time. Storing the photographed image together with the photographing date/time information includes storing the photographed image in a manner in which the photographing date/time information is included as one of the attribute information attached to the image data (image file).
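One way to sketch this recording step, storing the captured image together with its shooting date/time as attached attribute information, is shown below. This is a minimal illustration assuming a file-per-image layout with a JSON sidecar; the function name and storage layout are assumptions, not part of the patent.

```python
import json
from datetime import datetime
from pathlib import Path

def record_face_image(image_bytes: bytes, storage_dir: Path,
                      captured_at: datetime) -> Path:
    """Store a captured face image together with its shooting date/time.

    The timestamp is kept both in the file name and in a JSON sidecar,
    mimicking "attribute information attached to the image data".
    """
    storage_dir.mkdir(parents=True, exist_ok=True)
    stamp = captured_at.strftime("%Y%m%d_%H%M%S")
    image_path = storage_dir / f"face_{stamp}.jpg"
    image_path.write_bytes(image_bytes)
    # The sidecar holds the attribute information (shooting date/time).
    sidecar = image_path.with_suffix(".json")
    sidecar.write_text(json.dumps({"captured_at": captured_at.isoformat()}))
    return image_path
```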
  • the facial image transmission unit 13 transmits to the server device 300 the photographed image of the face obtained by the photographing by the facial image photographing unit 11 .
  • the face image transmission unit 13 extracts only the user's face from the captured image including the background, and transmits the image of the extracted face to the server device 300 .
  • the photographed image of the face including the background may be transmitted to the server device 300 as it is, and the server device 300 may extract only the face of the user from the photographed image including the background.
  • the server device 300 may recognize the user's facial portion in the photographed image including the background, and perform analysis for diagnosis on the recognized facial portion.
  • Hereinafter, a photographed image of a face, with or without a background, may simply be referred to as a face image.
  • the face image transmitting unit 13 transmits the captured image to the server device 300.
  • Alternatively, after the face image captured by the face image capturing unit 11 is stored in the face image storage unit 10A by the face image recording unit 12, the face image transmission unit 13 may, at an arbitrary timing designated by the user, read the photographed image from the face image storage unit 10A and transmit it to the server device 300.
  • the facial image transmission unit 13 transmits the photographed image of the user's face to the server device 300 together with the photographing date/time information and the user ID.
  • the user ID is identification information set for the vibration-related application when installing the vibration-related application in the user terminal 100, for example.
  • the shooting date/time information is used as information for specifying the user's face image.
  • a user ID is used as information for identifying a user.
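The transmission described above bundles the face image with the user ID (identifying the user) and the shooting date/time (identifying the image). A minimal sketch of such a payload follows; the field names and base64 encoding are illustrative assumptions, not a protocol specified in the patent.

```python
import base64
from datetime import datetime

def build_upload_payload(image_bytes: bytes, user_id: str,
                         captured_at: datetime) -> dict:
    """Bundle a face image with the user ID and shooting date/time.

    The server uses captured_at to identify the face image and
    user_id to identify the user.
    """
    return {
        "user_id": user_id,
        "captured_at": captured_at.isoformat(),
        "image": base64.b64encode(image_bytes).decode("ascii"),
    }
```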
  • the vibration information receiving unit 14 receives vibration information provided from the server device 300 .
  • the details of the function of server device 300 to provide vibration information will be described later.
  • the vibration information recording unit 15 stores the vibration information received by the vibration information receiving unit 14 in the vibration information storage unit 10B.
  • the diagnostic information receiving unit 16 receives diagnostic information provided from the server device 300 in association with the photographed image of the face. As will be described later, the association between the diagnostic information and the photographed image of the face is performed based on the user ID and photographing date and time information. The details of the function of the server device 300 to provide diagnostic information will be described later.
  • the diagnostic information recording unit 17 stores the diagnostic information received by the diagnostic information receiving unit 16 in the face image storage unit 10A in association with the photographed image of the face to be diagnosed.
  • the photographed image of the face to be diagnosed is specified by the photographing date and time information.
  • the vibration applying unit 18 performs processing for applying vibration to the user's face by supplying the vibration information stored in the vibration information storage unit 10B to a vibrator for generating vibration.
  • The vibrator may be provided in the user terminal 100 itself, or in a vibration supplier connected to the user terminal 100 by wire or wirelessly.
  • the vibration supplier may be a sheet or mask that is put on the user's face, or a pad that is put on the user's face.
  • The vibration supplier may also be a speaker unit that reproduces low frequencies, such as a woofer; instead of applying vibrations directly to the user, low-frequency vibrations may be emitted into the space where the user is present.
  • When the user terminal 100 itself is equipped with the vibrator, the vibration information stored in the vibration information storage unit 10B is supplied to the vibrator by operating a user interface for instructing vibration generation provided by the vibration-related application, and the user terminal 100 is brought into contact with the user's face while generating vibration.
  • When a separate vibration supplier is used, the user operates the user interface for instructing vibration generation while the vibration supplier is in contact with the user's face, and the vibration information stored in the vibration information storage unit 10B is supplied to the vibrator to vibrate the vibration supplier.
  • When the vibration supplier is a speaker unit, the speaker unit is installed at an arbitrary place in the space, and the vibration information stored in the vibration information storage unit 10B is supplied to the vibrator, thereby emitting vibration into the space.
  • the facial image display unit 19 causes the display to display the user's facial image stored in the facial image storage unit 10A and diagnostic information associated with the facial image.
  • FIG. 4 is a diagram showing an example of a screen displayed on the display by the face image display unit 19. As shown in FIG. 4, the face image display unit 19 treats a photographed image of the face, information indicating its photographing date and time, and the diagnosis information acquired from the server device 300 as one piece of set information, and displays one or more pieces of set information as a list on the screen.
  • the user captures an image of the face on a certain date and time and transmits the captured image to the server device 300 .
  • the user terminal 100 acquires from the server device 300 vibration information and diagnosis information obtained by analyzing the captured image.
  • the photographed image of the face obtained at this time, photographing date and time information, and diagnostic information constitute one set of information, which are stored in the face image storage unit 10A in association with each other.
  • Using the vibration information acquired at this time, the user causes the vibrator to generate vibration and applies it to the face.
  • the user takes an image of the face on another date and time and transmits the taken image to the server device 300 .
  • the user terminal 100 acquires from the server device 300 vibration information and diagnosis information obtained by analyzing the captured image.
  • The photographed image of the face obtained at this time, its photographing date and time information, and the diagnostic information constitute another set of information, and are stored in the face image storage unit 10A in association with each other.
  • the user applies vibration to the face using the vibration information acquired at this time.
  • One or more pieces of set information sequentially accumulated in the face image storage unit 10A in this manner are displayed by the face image display unit 19 as a list as shown in FIG.
  • This list display allows the user to confirm changes in the facial image and diagnostic information resulting from the application of vibration.
  • FIG. 3 is a block diagram showing a functional configuration example of the server device 300 according to the first embodiment.
  • The server device 300 of the present embodiment includes, as functional configurations, a diagnostic information acquiring unit 31, a vibration information acquiring unit 32, a vibration information providing unit 33, a diagnostic information providing unit 34, and a medical examination information providing unit 35. The diagnostic information acquisition unit 31 has a face image acquisition unit 31a and an image analysis unit 31b as more specific functional configurations.
  • the server device 300 also includes a face image storage section 30A and a vibration information storage section 30B as storage media.
  • Each of the functional blocks 31 to 35 can be configured by hardware, DSP, or software.
  • When configured by software, each of the functional blocks 31 to 35 is actually implemented by a computer equipped with a CPU, RAM, ROM, and the like, and is realized by running the vibration information providing program stored in a storage medium such as the RAM, ROM, a hard disk, or semiconductor memory.
  • the diagnostic information acquisition unit 31 acquires diagnostic information regarding the user's physical condition or mental condition. Specifically, the diagnostic information acquisition unit 31 acquires diagnostic information by the face image acquisition unit 31a and the image analysis unit 31b.
  • the facial image acquiring unit 31a acquires the photographed image of the user's face transmitted by the facial image transmitting unit 13 of the user terminal 100, and stores it in the facial image storage unit 30A together with the photographing date/time information and the user ID.
  • The image analysis unit 31b acquires diagnostic information by analyzing the physical or mental state of the user based on the photographed image of the face acquired by the face image acquisition unit 31a, and stores the diagnostic information in the face image storage unit 30A in association with the photographed image of the face (its photographing date and time information) and the user ID.
  • In the present embodiment, the image analysis unit 31b analyzes the photographed image of the face acquired by the face image acquisition unit 31a and acquires, as diagnostic information, hardness information indicating the degree of hardness of the facial expression and distortion information indicating the degree of distortion of the face.
  • the hardness information indicating the degree of hardness of facial expressions is information that classifies the degree of hardness of facial expressions into a plurality of levels.
  • the degrees of facial expression hardness are classified into two levels.
  • the facial expression hardness information is information indicating whether the facial expression is "hard” or "soft.” Note that the number of classification levels is not limited to two.
  • the distortion information indicating the degree of facial distortion is information that classifies the degree of facial distortion into a plurality of levels.
  • the degree of facial distortion is classified into two levels.
  • the facial distortion information is information indicating whether the distortion is "large” or "small.” Note that the number of classification levels is not limited to two.
  • When facial muscles become stiff, facial expressions become hard. It is known that facial muscles become stiff when a person is tense, tired, uncomfortable, worried, or stressed. Therefore, the hardness of the facial expression can be used as information suggesting the degree of a person's tension, fatigue, discomfort, worry, and stress.
  • For example, it is possible to analyze the hardness of facial expressions from captured face images using a classification model generated by machine learning. For example, by performing machine learning using a neural network with captured face images as learning data, it is possible to generate a classification model whose neural network parameters have been adjusted.
  • The learning data used here are photographed images of people's faces when they feel tense, tired, uncomfortable, worried, or stressed, and photographed images of the same people's faces when they feel none of these; each image is provided with label information indicating whether the facial expression is "hard" or "soft".
  • The form of the generated classification model is not limited to a neural network model; for example, a regression model, a tree model, a Bayesian model, or a clustering model is also possible.
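As one sketch of such a non-neural-network alternative, the snippet below trains a minimal nearest-centroid classifier (a simple clustering-style model) that labels feature vectors "hard" or "soft". The feature vectors, standing in for values extracted from face images, are entirely hypothetical.

```python
# Minimal nearest-centroid classifier; features and labels are
# illustrative stand-ins for values extracted from face images.

def train_centroids(samples):
    """Compute one centroid per label ("hard"/"soft") from labeled features."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(features, centroids):
    """Return the label of the nearest centroid (squared Euclidean distance)."""
    return min(centroids,
               key=lambda lbl: sum((a - b) ** 2
                                   for a, b in zip(features, centroids[lbl])))
```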
  • the image analysis unit 31b acquires distortion information indicating the degree of distortion of the face by analyzing the left-right symmetry of the face.
  • For example, the image analysis unit 31b sets a vertical virtual line passing through the center point of the nose, analyzes the amount of deviation between each part of the face on the left side of the virtual line and the corresponding part on the right side, and determines that the distortion is "large" if the magnitude of the deviation is greater than or equal to a threshold and "small" if it is less than the threshold.
  • For example, the magnitude of the deviation of each part can be detected by generating a line-symmetrical mirror image of the facial image on the left side of the virtual line, superimposing this mirror image on the facial image on the right side of the virtual line, and measuring the amount of difference between corresponding pixel positions.
  • the amount of misalignment is detected and scored for each part such as the eyebrows, eyes, lips, cheeks, etc., and the total score is calculated by weighting and adding the scores obtained for each part.
  • The distortion of the face may then be classified as either "large" or "small" depending on whether the total score is equal to or greater than the threshold.
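The weighted per-part scoring and thresholding described above can be sketched as follows. The part names, weights, and threshold value are illustrative assumptions, not values given in the patent.

```python
# Hypothetical per-part weights for the distortion score.
PART_WEIGHTS = {"eyebrows": 1.0, "eyes": 1.5, "lips": 1.2, "cheeks": 0.8}

def distortion_level(part_deviation_px, threshold=10.0):
    """Weight the per-part left/right deviations (in pixels), sum them into
    a total score, and classify distortion as "large" or "small"."""
    total = sum(PART_WEIGHTS[part] * dev
                for part, dev in part_deviation_px.items())
    return "large" if total >= threshold else "small"
```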
  • Combining the two pieces of diagnostic information, face images can be classified into one of four types: a first type F1 with a "hard" facial expression and "large" distortion, a second type F2 with a "soft" facial expression and "large" distortion, a third type F3 with a "soft" facial expression and "small" distortion, and a fourth type F4 with a "hard" facial expression and "small" distortion.
  • The vibration information acquisition unit 32 selects and acquires, from among a plurality of pieces of vibration information relating to vibrations with different tactile qualities (characteristics of vibration that cause a specific tactile sensation), the vibration information relating to vibration having a tactile quality corresponding to the diagnostic information acquired by the diagnostic information acquisition unit 31 and stored in the face image storage unit 30A. That is, the vibration information acquisition unit 32 acquires vibration information about vibration having a tactile quality corresponding to the hardness information and distortion information of the user's face analyzed by the image analysis unit 31b.
  • In the present embodiment, the vibration information is classified into four types: a first type V1 with a "hard" and "coarse" tactile sensation, a second type V2 with a "soft" and "coarse" tactile sensation, a third type V3 with a "soft" and "smooth" tactile sensation, and a fourth type V4 with a "hard" and "smooth" tactile sensation. These types V1 to V4 are hereinafter referred to as vibration tactile sensation types.
  • One piece of vibration information is stored in advance in the vibration information storage unit 30B for each of the four vibration tactile sensation types V1 to V4, classified in association with the hardness and roughness of the tactile sensation.
  • the vibration information is preferably low-frequency vibration information of 40 Hz or less (preferably 20 Hz or less, more preferably 10 Hz or less, still more preferably 5 Hz or less). It should be noted that audible sound such as music may be mixed with low-frequency vibration information.
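As a rough illustration of what such low-frequency vibration information might look like as waveform data, the sketch below generates a sine waveform at or below 40 Hz and optionally mixes in an audible tone. The sample rate, mixing amplitude, and function name are assumptions for demonstration, not parameters from the patent.

```python
import math
from typing import Optional

def make_vibration_waveform(freq_hz: float = 5.0, duration_s: float = 2.0,
                            sample_rate: int = 8000,
                            audible_freq_hz: Optional[float] = None):
    """Generate a low-frequency sine waveform (40 Hz or less) as vibration
    information; optionally mix in an audible tone such as a musical note."""
    assert freq_hz <= 40.0, "vibration information should be 40 Hz or less"
    n = int(duration_s * sample_rate)
    wave = [math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]
    if audible_freq_hz is not None:
        # Mix the audible component at a lower amplitude.
        wave = [w + 0.3 * math.sin(2 * math.pi * audible_freq_hz * i / sample_rate)
                for i, w in enumerate(wave)]
    return wave
```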
  • The vibration information acquisition unit 32 acquires, from the vibration information storage unit 30B, vibration information of the tactile quality corresponding to the diagnostic information, based on the correspondence relationship between the diagnosis result types F1 to F4 (specified by the combination of the hardness information of the user's facial expression and the facial distortion information) and the vibration tactile sensation types V1 to V4 (specified by the combination of the hardness (softness) and roughness (smoothness) of the tactile sensation of the vibration). Details of the vibration information stored in the vibration information storage unit 30B and the operation of the vibration information acquisition unit 32 will be described later.
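Structurally, this acquisition step is a table lookup from diagnosis type to vibration tactile sensation type to stored waveform. Since the patent defers the actual F-to-V correspondence to a later section, the one-to-one mapping and waveform names below are purely illustrative assumptions.

```python
# Hypothetical stand-in for the vibration information storage unit 30B.
VIBRATION_STORE = {
    "V1": "waveform_hard_coarse",
    "V2": "waveform_soft_coarse",
    "V3": "waveform_soft_smooth",
    "V4": "waveform_hard_smooth",
}

# Assumed F-to-V correspondence (the real one is defined later
# in the patent); shown here only to make the lookup concrete.
F_TO_V = {"F1": "V1", "F2": "V2", "F3": "V3", "F4": "V4"}

def acquire_vibration_info(face_type):
    """Look up the vibration information whose tactile quality
    corresponds to the diagnosed face type."""
    return VIBRATION_STORE[F_TO_V[face_type]]
```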
  • the vibration information providing unit 33 provides the user terminal 100 with the vibration information acquired by the vibration information acquiring unit 32 .
  • the vibration information provided by the vibration information providing unit 33 is received by the vibration information receiving unit 14 of the user terminal 100, and is stored by the vibration information recording unit 15 in the vibration information storage unit 10B.
  • The diagnostic information providing unit 34 provides the user terminal 100 with the diagnostic information acquired by the image analysis unit 31b in association with the photographed image of the face acquired by the face image acquisition unit 31a. That is, the diagnostic information providing unit 34 provides the user terminal 100 with the diagnostic information stored in the face image storage unit 30A in association with the photographed image of the face, together with the photographing date and time information that can specify that photographed image.
• The diagnostic information provided by the diagnostic information providing unit 34 together with the photographing date/time information is received by the diagnostic information receiving unit 16 of the user terminal 100, and the diagnostic information recording unit 17 stores it in the face image storage unit 10A in association with the facial image corresponding to that photographing date/time information.
• The medical examination information providing unit 35 provides the medical institution terminal 200, as examination information, with the photographed image of the user's face acquired by the face image acquisition unit 31a and the diagnostic information acquired by the image analysis unit 31b (that is, the photographed facial image stored in the face image storage unit 30A and the diagnostic information associated with it).
  • the medical institution terminal 200 accesses the server device 300, designates the user ID provided by the user, and requests acquisition of diagnostic information.
  • the medical examination information providing unit 35 generates a screen containing a list of photographed images of the user's face and diagnostic information stored in the facial image storage unit 30A in association with the user ID, and sends it to the medical institution terminal. 200 offers.
  • a screen including information similar to the screen example shown in FIG. 4 is displayed on the display of the medical institution terminal 200 as well.
• As described above, the plurality of pieces of vibration information with different tactile qualities stored in the vibration information storage unit 30B are classified in association with the hardness (softness) and roughness (smoothness) of the tactile sensation of the vibration. That is, each piece of vibration information stored in the vibration information storage unit 30B has a unique tactile effect with respect to the hardness and roughness of the tactile sensation.
  • the haptic effect of vibration information is specified by the inherent tactile parameters of the vibration information.
• A tactile parameter is a parameter representing the degree of a pair of opposing tactile qualities (hereinafter referred to as a tactile pair), such as <hard-soft> and <rough-smooth>, and constitutes one element of tactile sensation.
• As the tactile parameter related to the <hard-soft> tactile pair, the intensity of the vibration waveform (first tactile parameter h) can be used. That is, the larger the intensity, the harder the tactile sensation, and the smaller the intensity, the softer it is.
• As the tactile parameter related to the <rough-smooth> tactile pair, the length of the divided sections of the vibration waveform (second tactile parameter t) can be used. That is, the larger the value representing the length of a divided section, the smoother the tactile sensation, and the smaller that value, the rougher it is.
  • FIG. 6 is a diagram for explaining two tactile parameters h and t, and schematically shows an envelope waveform (hereinafter referred to as a vibration waveform) of vibration information.
  • the vibration waveform is divided into a plurality of parts along the time axis.
• In the example shown in FIG. 6, the waveform is divided at each time at which its amplitude reaches a local minimum. That is, the first divided section T1 runs from the start point of the vibration waveform to the first local minimum, the second divided section T2 from the first local minimum to the second local minimum, the third divided section T3 from the second local minimum onward, and so on, the vibration waveform being divided into a plurality of sections along the time axis.
  • the method of dividing the vibration waveform is not limited to the example shown in FIG.
  • the vibration waveform may be divided into a plurality of sections at each time when the amplitude reaches its maximum.
  • the vibration waveform may be divided into a plurality of sections each time the amplitude value becomes zero.
  • a plurality of characteristic points that can be distinguished from other points in the vibration waveform may be extracted, and the vibration waveform may be divided into a plurality of sections for each characteristic point.
• For example, characteristic points F1, F2, F3, ... of the vibration waveform may be extracted, and a plurality of divided sections T1, T2, T3, ... may be set with each characteristic point as a boundary.
• When the vibration information is configured as a pattern in which the same vibration waveform is repeated multiple times, the divided sections T1, T2, T3, ... may be set in units of the repeated waveform.
• For each of the divided sections T1, T2, T3, ..., a representative amplitude h1, h2, h3, ... and a time length t1, t2, t3, ... are specified. In the example of FIG. 6, the representative amplitudes h1, h2, h3, ... are the differences between a local minimum value and the maximum value in the respective divided sections T1, T2, T3, ... In the first divided section T1, the difference between the local minimum value and the maximum value in the section is the representative amplitude h1.
• In the second divided section T2, the local minimum value at the start point of the section is larger than that at the end point, so the difference between the minimum value at the start point and the maximum value in the section becomes the representative amplitude h2.
• In the third divided section T3, the local minimum value at the end point is larger than that at the start point, so the difference between the minimum value at the end point and the maximum value in the section is the representative amplitude h3.
  • the method of specifying the representative amplitude shown here is an example, and is not limited to this.
• For example, the difference between the smaller of the minimum values at the start point and the end point of each divided section T1, T2, T3, ... and the maximum value in that section may be specified as the representative amplitude.
• When the vibration waveform takes both positive and negative values, the positive maximum value or the negative minimum value in each divided section may be specified as the representative amplitude for the first tactile parameter h.
• In the latter case, the absolute value of the negative minimum value may be specified as the representative amplitude of the first tactile parameter h.
• The first tactile parameter h is, for example, the average value, maximum value, minimum value, or median value of the representative amplitudes h1, h2, h3, ... specified for the respective divided sections.
• Similarly, the second tactile parameter t is, for example, the average value, maximum value, minimum value, or median value of the time lengths t1, t2, t3, ... of the respective divided sections.
• Alternatively, the number of representative amplitudes h1, h2, h3, ... exceeding a threshold value, or the ratio of that number to the total number, may be used as the second tactile parameter t.
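The derivation of the two tactile parameters from a vibration waveform, as described above, can be sketched in Python. This is an illustrative sketch only, not the claimed implementation: it divides a sampled envelope at local minima, takes each representative amplitude as the difference between the section maximum and the larger of the two boundary minima, and uses the average as the aggregate; the function names and the sampling interface are assumptions.

```python
def divide_at_minima(envelope):
    """Split a sampled envelope waveform into divided sections T1, T2, ...
    at each local minimum, returning (start, end) index pairs.
    (Division at maxima, zero crossings, or other characteristic points
    is equally possible, as noted above.)"""
    minima = [i for i in range(1, len(envelope) - 1)
              if envelope[i - 1] > envelope[i] <= envelope[i + 1]]
    bounds = [0] + minima + [len(envelope) - 1]
    return list(zip(bounds[:-1], bounds[1:]))


def tactile_parameters(envelope, sample_period):
    """Return (h, t): the first tactile parameter h as the average of the
    representative amplitudes h1, h2, ..., and the second tactile
    parameter t as the average of the section time lengths t1, t2, ...
    Each representative amplitude is taken here as the section maximum
    minus the larger boundary minimum, one of the variants described."""
    sections = divide_at_minima(envelope)
    amps = [max(envelope[s:e + 1]) - max(envelope[s], envelope[e])
            for s, e in sections]
    lengths = [(e - s) * sample_period for s, e in sections]
    return sum(amps) / len(amps), sum(lengths) / len(lengths)
```

With h and t in hand, a larger h corresponds to a harder tactile sensation and a larger t to a smoother one, matching the classification that follows.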
• As described above, the vibration information has the first tactile parameter h (the intensity of the vibration waveform) representing the hardness of the tactile sensation of the vibration, and the second tactile parameter t (the length of the divided sections) representing the roughness of that tactile sensation, and can be classified into one of four vibrotactile sensation types V1 to V4 by the combination of the first tactile parameter h and the second tactile parameter t.
• Specifically, the vibration information is classified into four types: a first type V1 in which the first tactile parameter h is equal to or greater than a first threshold Th1 and the reciprocal (1/t) of the second tactile parameter t is equal to or greater than a second threshold Th2; a second type V2 in which the first tactile parameter h is less than the first threshold Th1 and the reciprocal of the second tactile parameter t is equal to or greater than the second threshold Th2; a third type V3 in which the first tactile parameter h is less than the first threshold Th1 and the reciprocal of the second tactile parameter t is less than the second threshold Th2; and a fourth type V4 in which the first tactile parameter h is equal to or greater than the first threshold Th1 and the reciprocal of the second tactile parameter t is less than the second threshold Th2.
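The threshold test described above maps directly to a small function. A sketch under the assumption that h, t, Th1, and Th2 are plain numbers (the function name is illustrative):

```python
def classify_vibrotactile_type(h, t, th1, th2):
    """Classify vibration information into one of the four vibrotactile
    sensation types V1-V4 from the first tactile parameter h (hardness)
    and the reciprocal 1/t of the second tactile parameter (roughness)."""
    hard = h >= th1           # h >= Th1: "hard"; otherwise "soft"
    rough = (1.0 / t) >= th2  # 1/t >= Th2: "rough"; otherwise "smooth"
    if hard:
        return "V1" if rough else "V4"
    return "V2" if rough else "V3"
```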
• The vibration information acquisition unit 32 acquires, from the vibration information storage unit 30B, vibration information of the tactile quality corresponding to the diagnostic information of the user's face, based on the correspondence between the diagnosis result types F1 to F4, specified by the combination of the facial-expression hardness information and the facial distortion information, and the vibration tactile sensation types V1 to V4, specified by the combination of the hardness and roughness of the tactile sensation of the vibration information.
• Specifically, the vibration information acquisition unit 32 acquires from the vibration information storage unit 30B vibration information belonging to the vibration tactile sensation type V1 to V4 associated with the diagnosis result type F1 to F4, so that both the hardness of the user's facial expression and the distortion of the user's face approach a moderate state (the state at the position indicated by Fc in FIG. 5A).
• Here, the diagnosis result types F1 to F4 and the vibration tactile sensation types V1 to V4 are associated as <F1-V3>, <F2-V4>, <F3-V1>, and <F4-V2>.
• This means that when the diagnostic information is of the first type F1, vibration information of the opposite third type V3 is provided; when it is of the second type F2, vibration information of the opposite fourth type V4 is provided; when it is of the third type F3, vibration information of the opposite first type V1 is provided; and when it is of the fourth type F4, vibration information of the opposite second type V2 is provided.
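The fixed correspondence <F1-V3>, <F2-V4>, <F3-V1>, <F4-V2> can be expressed as a lookup table; the following is a minimal sketch (the names are illustrative, not from the embodiment):

```python
# Each diagnosis result type is paired with the vibrotactile type of the
# opposite tactile quality, so that the face is guided toward the
# moderate state Fc.
OPPOSITE_VIBROTACTILE_TYPE = {"F1": "V3", "F2": "V4", "F3": "V1", "F4": "V2"}


def select_vibration_type(diagnosis_type):
    """Return the vibrotactile sensation type to provide for a given
    diagnosis result type."""
    return OPPOSITE_VIBROTACTILE_TYPE[diagnosis_type]
```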
• For example, when the diagnostic information is of the first type F1, vibration information of the third type V3, that is, vibration information having a "soft" and "smooth" tactile sensation, is provided to the user. This vibration information acts to soften the hard expression of the user's face and to smooth and reduce the distortion of the user's face. When the user actually applies this vibration to the face, the state of the face belonging to the first type F1 is expected to approach the moderate state Fc.
• As described above, in the first embodiment, hardness information indicating the degree of hardness of the user's facial expression and distortion information indicating the degree of distortion of the user's face are acquired as diagnostic information regarding the user's physical or mental state, and vibration information having a tactile quality corresponding to that diagnostic information is acquired from among a plurality of pieces of vibration information having different tactile qualities and is provided to the user.
• This makes it possible to provide the user with vibration information regarding vibrations that produce an appropriate tactile sensation according to the user's physical or mental state. For example, a user with a certain symptom can be provided with vibration information that produces a tactile sensation effective in alleviating or improving that symptom, according to diagnostic information on the physical or mental condition that appears on the face as a result of the symptom.
  • the first embodiment provides a mechanism for sharing the photographed image of the user's face and diagnostic information between the user and the medical institution.
• The user can use the shared information to engage in dialogue with the medical institution when using the vibration information to improve a disorder or symptom, and can also receive appropriate advice from the medical institution.
• In the above embodiment, the diagnostic information is classified into four diagnosis result types F1 to F4 from the two viewpoints of the hardness of the user's facial expression and the degree of facial distortion, and the vibration information is classified into four vibration tactile sensation types V1 to V4 from the two viewpoints of the hardness and roughness of the tactile sensation, but the present invention is not limited to this.
• For example, as shown in FIG. 8, the diagnostic information may be classified into two diagnosis result types F1' and F2' from the viewpoint of the hardness of the user's facial expression, and the vibration information into two vibration tactile sensation types V1' and V2' from the viewpoint of the hardness of the tactile sensation.
• Further, as shown in FIG. 9, the diagnostic information may be classified into two diagnosis result types F1'' and F2'' from the viewpoint of the degree of distortion of the user's face, and the vibration information into two vibration tactile sensation types V1'' and V2'' from the viewpoint of the roughness of the tactile sensation.
  • the face image captured by the face image capturing unit 11 may be a moving image.
• In that case, the image analysis shown in the above embodiment may be executed for each frame, and index values relating to at least one of changes in the degree of facial-expression hardness and changes in the degree of facial distortion may be acquired as diagnostic information regarding the user's physical or mental state.
  • FIG. 10 is a block diagram showing a functional configuration example of the user terminal 100 according to the second embodiment.
  • the parts denoted by the same reference numerals as those shown in FIG. 2 have the same functions, and redundant description will be omitted here.
• The user terminal 100 according to the second embodiment includes, in place of the face image photographing unit 11, the face image recording unit 12, the face image transmitting unit 13, the diagnostic information receiving unit 16, and the diagnostic information recording unit 17 shown in FIG. 2, a medical inquiry execution unit 21, an answer information recording unit 22, an answer information transmission unit 23, a diagnostic information reception unit 26, a diagnostic information recording unit 27, and an answer information display unit 29, and has an answer information storage unit 20A.
  • the medical inquiry execution unit 21 provides the user with medical inquiry information for performing self-diagnosis on a specific symptom, and acquires answer information indicating the answer input by the user to the medical inquiry information.
  • Particular symptoms are, for example, depression, dementia, insomnia, and the like.
• Self-diagnosis rating scales include, for example, the SDS (Self-rating Depression Scale), SDC (Symbol Digit Coding), EQ-5D (EuroQol 5 Dimension), QIDS (Quick Inventory of Depressive Symptomatology), BDI (Beck Depression Inventory), CES-D (Center for Epidemiologic Studies Depression Scale), DSM-IV (The Diagnostic and Statistical Manual of Mental Disorders, 4th edition), EPDS (Edinburgh Postnatal Depression Scale), MMSE (Mini-Mental State Examination), HDS-R (Hasegawa's Dementia Scale-Revised), and ADAS-cog (Alzheimer's Disease Assessment Scale-cognitive subscale).
• The medical inquiry execution unit 21 outputs a message prompting a medical inquiry to the display at a predetermined timing, in accordance with a cycle appropriate to the self-assessment scale used in the inquiry (for example, every other day). The medical inquiry execution unit 21 then presents the inquiry information to the user through a predetermined inquiry screen in response to an instruction operation by the user who has seen this message, and accepts the answer information input by the user for the inquiry information presented through that screen.
  • the answer information recording unit 22 stores the answer information obtained by executing the inquiry by the inquiry execution unit 21 in the answer information storage unit 20A together with the information indicating the date and time of the inquiry. Storing the answer information together with the implementation date/time information includes storing the answer information in a manner in which the implementation date/time information is included as one of the attribute information accompanying the text data (text file) of the answer.
  • the response information transmission unit 23 transmits the response information acquired by the medical inquiry execution unit 21 to the server device 300 together with the date and time information of the medical inquiry and the user ID.
  • the diagnostic information receiving unit 26 receives the diagnostic information provided from the server device 300 in association with the answer information of the inquiry. Diagnosis information and inquiry response information are associated based on the user ID and inquiry date and time information.
  • the diagnostic information recording unit 27 causes the diagnostic information received by the diagnostic information receiving unit 26 to be stored in the answer information storage unit 20A in association with the answer information of the medical interview targeted for diagnosis.
  • Questionnaire response information that is the target of diagnosis is specified based on the date and time information of the inquiry.
  • the answer information display unit 29 causes the display to display the inquiry answer information stored in the answer information storage unit 20A and diagnostic information (symptom degree information described later) associated with the answer information.
• FIG. 12 is a diagram showing an example of a screen displayed on the display by the answer information display unit 29. As shown in FIG. 12, the answer information display unit 29 lists on the screen one or more sets of information, each set including the answer information of a medical inquiry, information indicating the date and time of the inquiry, and the diagnostic information acquired from the server device 300.
  • FIG. 11 is a block diagram showing a functional configuration example of the server device 300 according to the second embodiment.
  • the same reference numerals as those shown in FIG. 3 have the same functions, and redundant description will be omitted here.
• The server device 300 according to the second embodiment includes, in place of the diagnostic information acquisition unit 31, the vibration information acquisition unit 32, the diagnostic information providing unit 34, the medical examination information providing unit 35, and the face image storage unit 30A shown in FIG. 3, a diagnostic information acquisition unit 41, a vibration information acquisition unit 42, a diagnostic information providing unit 44, a medical examination information providing unit 45, an answer information storage unit 40A, and a vibration information storage unit 40B.
  • the server device 300 according to the second embodiment further includes an answer information acquisition unit 46 .
  • the response information acquisition unit 46 acquires the response information transmitted by the response information transmission unit 23 of the user terminal 100, and stores it in the response information storage unit 40A together with the date and time information of the medical interview and the user ID.
• The diagnostic information acquisition unit 41 analyzes the answer information acquired by the answer information acquisition unit 46, acquires as diagnostic information symptom degree information indicating the user's self-diagnosis result regarding the degree of a specific symptom related to the user's physical or mental condition, and stores it in the answer information storage unit 40A in association with the inquiry answer information and the user ID.
  • the diagnostic information acquisition unit 41 calculates the score of the self-assessment scale based on the answer information, and classifies the degree of symptoms into a plurality of levels according to the score.
  • the degree of symptoms is classified into two levels of "large” and “small” depending on whether the score of the self-assessment scale is equal to or higher than the threshold.
  • Information indicating any level classified in this way is symptom degree information.
• The vibration information acquisition unit 42 acquires, from among a plurality of pieces of vibration information related to vibrations having different tactile qualities, vibration information regarding vibration having a tactile quality corresponding to the diagnostic information acquired by the diagnostic information acquisition unit 41 (the diagnostic information stored in the answer information storage unit 40A in association with the inquiry answer information). That is, the vibration information acquisition unit 42 acquires vibration information regarding vibration having a tactile quality corresponding to the symptom degree information acquired by the diagnostic information acquisition unit 41.
  • the plurality of pieces of vibration information related to vibrations with different tactile qualities are classified in association with the hardness (softness) or roughness (smoothness) of the tactile sensation of the vibrations.
  • the vibration information is pre-stored in the vibration information storage unit 40B in a manner classified into a first type V1′ with a “hard” tactile sensation and a second type V2′ with a “soft” tactile sensation.
• Alternatively, the vibration information is stored in advance in the vibration information storage unit 40B in a manner classified into a first type V1'' with a "rough" tactile sensation and a second type V2'' with a "smooth" tactile sensation.
• When the symptom degree information acquired by the diagnostic information acquisition unit 41 indicates that the degree of the symptom is "large", the vibration information acquisition unit 42 acquires the vibration information of the first type V1' from the vibration information storage unit 40B in the case of FIG. 8(b), and the vibration information of the first type V1'' in the case of FIG. 9(b). When the symptom degree information indicates that the degree of the symptom is "small", the vibration information acquisition unit 42 acquires the vibration information of the second type V2' from the vibration information storage unit 40B in the case of FIG. 8(b), and the vibration information of the second type V2'' in the case of FIG. 9(b).
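The selection logic of the second embodiment reduces to a score threshold plus a two-way lookup. The following sketch assumes the self-assessment score and threshold are plain numbers and that the stored classification follows either the hardness viewpoint (V1'/V2', FIG. 8(b)) or the roughness viewpoint (V1''/V2'', FIG. 9(b)); the function names and the `viewpoint` argument are assumptions:

```python
def symptom_degree(score, threshold):
    """Classify a self-assessment scale score into the two levels used
    by the diagnostic information acquisition unit 41."""
    return "large" if score >= threshold else "small"


def select_stored_vibration_type(degree, viewpoint):
    """Pick the stored vibrotactile type for a symptom degree.
    viewpoint "hardness"  -> V1'/V2'   (hard vs. soft, FIG. 8(b));
    viewpoint "roughness" -> V1''/V2'' (rough vs. smooth, FIG. 9(b))."""
    if viewpoint == "hardness":
        return "V1'" if degree == "large" else "V2'"
    return "V1''" if degree == "large" else "V2''"
```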
• The diagnostic information providing unit 44 provides the user terminal 100 with the diagnostic information (symptom degree information) acquired by the diagnostic information acquisition unit 41 in association with the answer information acquired by the answer information acquisition unit 46. That is, the diagnostic information providing unit 44 provides the user terminal 100 with the diagnostic information stored in the answer information storage unit 40A in association with the inquiry answer information, together with the inquiry implementation date and time information that can specify that answer information.
• The diagnostic information provided by the diagnostic information providing unit 44 together with the implementation date and time information is received by the diagnostic information receiving unit 26 of the user terminal 100, and the diagnostic information recording unit 27 stores it in the answer information storage unit 20A in association with the answer information corresponding to that implementation date and time information.
• The medical examination information providing unit 45 provides the medical institution terminal 200, as examination information, with the answer information acquired by the answer information acquisition unit 46 and the diagnostic information acquired by the diagnostic information acquisition unit 41 (that is, the answer information stored in the answer information storage unit 40A and the symptom degree information associated with it), for example in response to a request from the medical institution terminal 200.
• As described above, in the second embodiment, symptom degree information indicating a self-diagnosis result based on a rating scale corresponding to a specific symptom is acquired as diagnostic information about the user's physical or mental condition, and vibration information having a tactile quality corresponding to the symptom degree information is acquired from among a plurality of pieces of vibration information having different tactile qualities and is provided to the user.
• According to the second embodiment configured in this way, it is possible to provide the user with appropriate vibration information having a tactile quality corresponding to the degree of a symptom (the result of self-diagnosis) related to the user's physical or mental condition.
• For example, a user with a certain symptom can be provided with vibration information that produces a tactile sensation effective in alleviating or improving that symptom, according to the diagnostic information regarding the degree of the symptom.
  • the second embodiment provides a mechanism for sharing the answer information and diagnosis information of the medical interview between the user and the medical institution.
• The user can use the shared information to engage in dialogue with the medical institution when using the vibration information to improve a disorder or symptom, and can also receive appropriate advice from the medical institution.
  • FIG. 13 is a block diagram showing a functional configuration example of the user terminal 100 according to the third embodiment.
  • the components denoted by the same reference numerals as those shown in FIG. 10 have the same functions, and redundant description will be omitted here.
  • FIG. 14 is a block diagram showing a functional configuration example of the server device 300 according to the third embodiment.
  • the same reference numerals as those shown in FIG. 11 have the same functions, and redundant description will be omitted here.
  • FIG. 15 is a block diagram showing a functional configuration example of the medical institution terminal 200 according to the third embodiment.
• The functional configuration of the medical institution terminal 200 according to the third embodiment includes a medical examination information input unit 71, a medical examination information recording unit 72, a medical examination information transmitting unit 73, a diagnostic information receiving unit 76, a diagnostic information recording unit 77, and an examination information display unit 79.
  • the medical institution terminal 200 according to the third embodiment includes a medical examination information storage unit 70A as a storage medium.
• The medical examination information input unit 71 inputs examination information indicating the results of examinations performed on the user by a doctor of a medical institution regarding a specific symptom. Examinations performed by a doctor include examinations by interviewing the user, examinations by observing the inside or outside of the user's body using medical equipment, and examinations by measuring the user's biometric information using measuring equipment.
  • the medical examination information input unit 71 acquires medical examination information input by a doctor by operating an input device such as a keyboard. Further, the medical examination information input unit 71 may input biological information measured by a measuring device as part of the medical examination information.
  • the medical examination information recording unit 72 stores the medical examination information input by the medical examination information input unit 71 in the medical examination information storage unit 70A together with the medical examination date and time information.
  • the medical examination information transmission unit 73 transmits the medical examination information input by the medical examination information input unit 71 to the server device 300 together with the medical examination date/time information and the user ID.
  • the diagnostic information receiving unit 76 receives diagnostic information (symptom degree information described later) provided from the server device 300 in association with the medical examination information. Diagnosis information and consultation information are associated based on the user ID and consultation date/time information.
  • the diagnostic information recording section 77 stores the diagnostic information received by the diagnostic information receiving section 76 in the medical examination information storage section 70A in association with the medical examination information that is the target of the diagnosis. The examination information targeted for diagnosis is specified based on the examination date and time information.
  • the medical examination information display unit 79 displays the medical examination information stored in the medical examination information storage unit 70A and the diagnostic information (symptom degree information) associated with the medical examination information on the display.
  • the screen displayed on the display by the examination information display unit 79 is a list of one or more pieces of set information, for example, including examination information, examination date and time information, and diagnosis information.
  • the diagnosis information may be added to the chart screen displaying the consultation information and the consultation date/time information.
• The server device 300 according to the third embodiment includes, in place of the answer information acquisition unit 46, the diagnostic information acquisition unit 41, the diagnostic information providing unit 44, the medical examination information providing unit 45, and the answer information storage unit 40A shown in FIG. 11, a medical examination information acquisition unit 66, a diagnostic information acquisition unit 61, a diagnostic information providing unit 64, a diagnosis result information providing unit 65, and a medical examination information storage unit 60A.
• The medical examination information acquisition unit 66 acquires the examination information transmitted by the medical examination information transmission unit 73 of the medical institution terminal 200, and stores it in the medical examination information storage unit 60A together with the examination date/time information and the user ID.
• The diagnostic information acquisition unit 61 analyzes the examination information acquired by the medical examination information acquisition unit 66 to acquire, as diagnostic information, symptom degree information indicating the degree of a specific symptom related to the user's physical or mental condition, and stores it in the medical examination information storage unit 60A in association with the examination information and the user ID.
  • the diagnostic information acquisition unit 61 classifies the degree of symptoms into a plurality of levels based on the medical examination information, and acquires information indicating one of the classified levels as symptom degree information.
  • the degree of symptoms is classified into two levels of "large” and "small".
• Classification according to the degree of symptoms based on examination information can be performed, for example, using a classification model generated by machine learning. For example, by performing machine learning using examination information obtained from many people as learning data, it is possible to generate a classification model in which the parameters of a neural network are adjusted so that symptom degree information is output when examination information is input. Instead of a neural network model, a classification model of any other form, such as a regression model, tree model, Bayesian model, or clustering model, may be generated.
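Any such classifier takes a numeric feature vector derived from the examination information and returns a symptom degree level. As a self-contained stand-in for the learned model (the description leaves the model form open; a neural network, regression, tree, Bayesian, or clustering model could be substituted), here is a minimal nearest-centroid sketch:

```python
class NearestCentroidClassifier:
    """Toy classification model: predicts the symptom degree ("large" or
    "small") of an examination feature vector by finding the nearest
    class centroid among the training examples."""

    def fit(self, vectors, labels):
        # Accumulate per-class sums and counts, then average into centroids.
        sums, counts = {}, {}
        for v, y in zip(vectors, labels):
            acc = sums.setdefault(y, [0.0] * len(v))
            for i, x in enumerate(v):
                acc[i] += x
            counts[y] = counts.get(y, 0) + 1
        self.centroids = {y: [s / counts[y] for s in acc]
                          for y, acc in sums.items()}
        return self

    def predict(self, vector):
        # Squared Euclidean distance to each class centroid.
        def dist2(c):
            return sum((a - b) ** 2 for a, b in zip(vector, c))
        return min(self.centroids, key=lambda y: dist2(self.centroids[y]))
```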
  • the diagnostic information providing unit 64 provides the medical institution terminal 200 with the diagnostic information (symptom degree information) acquired by the diagnostic information acquiring unit 61 in association with the medical examination information acquired by the medical examination information acquiring unit 66 . That is, the diagnostic information providing unit 64 provides the medical institution terminal 200 with the diagnostic information stored in the medical examination information storage unit 60A in association with the medical examination information together with the medical examination date and time information that can specify the medical examination information.
  • The diagnostic information provided by the diagnostic information providing unit 64 together with the examination date and time information is received by the diagnostic information receiving unit 76 of the medical institution terminal 200, and the diagnostic information recording unit 77 stores it in the medical examination information storage unit 70A in association with the medical examination information corresponding to the examination date and time information.
  • The diagnostic result information providing unit 65 provides the user terminal 100, as diagnosis result information, with the medical examination information acquired by the medical examination information acquisition unit 66 and the diagnostic information acquired by the diagnostic information acquisition unit 61 (that is, the medical examination information stored in the medical examination information storage unit 60A and the symptom degree information associated with it).
  • The user terminal 100 according to the third embodiment does not include the medical interview execution unit 21, the answer information recording unit 22, or the answer information transmission unit 23 shown in FIG. 10 as functional components. Further, in the user terminal 100 according to the third embodiment, instead of the diagnostic information receiving unit 26, the diagnostic information recording unit 27, the answer information display unit 29, and the answer information storage unit 20A shown in FIG. 10, a diagnostic result information receiving unit 56, a diagnostic result information recording unit 57, a diagnostic result information display unit 59, and a diagnostic result information storage unit 50A are provided.
  • The diagnostic result information receiving unit 56 receives the diagnosis result information provided from the server device 300.
  • The diagnostic result information recording unit 57 stores the diagnosis result information received by the diagnostic result information receiving unit 56 in the diagnostic result information storage unit 50A.
  • The diagnostic result information display unit 59 displays the diagnosis result information stored in the diagnostic result information storage unit 50A on the display.
  • With the third embodiment configured as described above, it is possible to provide the user with appropriate vibration information having a tactile quality corresponding to the degree of symptoms related to the user's physical or mental condition (the result of a medical examination by a doctor). Further, the third embodiment provides a mechanism for sharing, between the user and the medical institution, the medical examination information indicating the examination result by the doctor and the diagnostic information (symptom degree information) analyzed based on that result. As a result, when using the vibration information to improve a disorder or symptom, the user can draw on the shared information in dialogue with the medical institution and can also receive appropriate advice from the medical institution.
  • In the above embodiments, an example has been described in which the vibration waveform (the substance of the vibration) that is the source of the vibration is stored as vibration information in the vibration information storage units 30B and 40B, and the vibration information is transmitted from the server device 300 to the user terminal 100. However, the vibration information storage units 30B and 40B may instead store, as vibration information, vibration identification information for specifying the substance of the vibration, and the vibration identification information may be provided from the server device 300 to the user terminal 100.
  • In this case, the user terminal 100 accesses the server device 300 or another server device and downloads the substance of the vibration by making a request using the vibration identification information. For example, the vibration identification information may be a URL (Uniform Resource Locator) for accessing the download destination.
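The identification-information flow above can be sketched as follows: the server hands the terminal only a locator, and the terminal constructs the download request from it. The host name, path, and query parameter name below are hypothetical; the patent states only that the identification information may be a URL.

```python
# Sketch of building a download request from vibration identification
# information. Host, path, and parameter name are illustrative assumptions.
from urllib.parse import urlencode

def build_download_url(base, vibration_id):
    """Build the request URL for fetching the substance of a vibration."""
    return f"{base}?{urlencode({'vibration_id': vibration_id})}"

url = build_download_url("https://example.com/vibrations", "vib-0042")
print(url)  # https://example.com/vibrations?vibration_id=vib-0042
```

The terminal would then issue an ordinary HTTP GET against this URL (e.g. with `urllib.request.urlopen`) and play back the waveform it receives; that network step is omitted here.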
  • In the above embodiments, an example has been described in which one piece of vibration information is stored in the vibration information storage units 30B and 40B for each vibration tactile sensation type, and the one piece of vibration information belonging to the vibration tactile sensation type associated with the diagnosis result type is acquired from the vibration information storage units 30B and 40B. However, the present invention is not limited to this.
  • A plurality of pieces of vibration information may be stored in advance in the vibration information storage units 30B and 40B for one vibration tactile sensation type, and any one of the plurality of pieces of vibration information belonging to the vibration tactile sensation type associated with the diagnosis result type may be selected and acquired from the vibration information storage units 30B and 40B. The selection in this case may be performed automatically by the vibration information acquisition units 32 and 42, or manually by the user.
  • When the vibration information acquisition units 32 and 42 select automatically, the selection may be made at random, for example. Alternatively, each time a photographed face image or answer information to an inquiry is sent from one user, the selection may be made so that the same vibration information is not provided continuously to that user. For example, if the diagnostic information is continuously classified into the same diagnosis result type, vibration information different from the previously selected vibration information is acquired from among the plurality of pieces of vibration information belonging to the vibration tactile sensation type corresponding to that diagnosis result type.
  • When the user selects manually, vibration identification information representing the plurality of pieces of vibration information belonging to the vibration tactile sensation type associated with the diagnosis result type is presented on the user terminal 100, and the vibration information (the substance of the vibration) corresponding to the vibration identification information selected by the user can be provided. Along with the vibration identification information, comment information describing the tactile sensation of each vibration or the like may be presented to the user.
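The automatic-selection policy described above (a random choice that avoids repeating the entry previously provided to the same user) can be sketched as below. The vibration IDs are hypothetical placeholders, not identifiers from the patent.

```python
# Sketch: pick one of several vibration-information entries for a tactile
# type, avoiding the entry selected for the same user last time.
import random

def select_vibration(candidates, last_selected=None, rng=random):
    """Randomly select a vibration-info ID, differing from last_selected
    whenever more than one candidate exists."""
    pool = [c for c in candidates if c != last_selected] or list(candidates)
    return rng.choice(pool)

candidates = ["vib-a", "vib-b", "vib-c"]
first = select_vibration(candidates)
second = select_vibration(candidates, last_selected=first)
# With three candidates, two consecutive selections never repeat.
```

The `or list(candidates)` fallback covers the degenerate case where only one entry exists for the tactile type, in which case repeating it is unavoidable.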
  • In the above embodiments, examples have been described in which hardness information indicating the degree of facial expression hardness, distortion information indicating the degree of facial distortion, and symptom degree information indicating a diagnosis result (by the user or by a doctor) regarding the degree of a specific symptom are used as diagnostic information concerning the user's physical or mental condition. However, the present invention is not limited to these.
  • For example, deviation degree information indicating the degree of deviation of the user's measured biometric information from its standard or normal value may be used as diagnostic information, and vibration information having a tactile quality corresponding to the deviation degree information may be provided.
  • Alternatively, symptom degree information or the like may be acquired as diagnostic information by analyzing the user's physical or mental condition based on audio information of the user's voice.
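As a hypothetical sketch of the deviation-degree variant above, the following computes how far a measured biometric value falls outside an assumed standard range and maps the result to a vibration tactile type. The ranges, threshold, and type names are illustrative assumptions, not values from the patent.

```python
# Sketch: deviation degree of a biometric measurement from its standard
# range, mapped to a tactile type. Ranges and names are hypothetical.
STANDARD_RANGES = {"heart_rate": (60, 100), "body_temp": (36.0, 37.2)}

def deviation_degree(metric, value):
    """Relative deviation of value from the nearer bound of its standard
    range; 0.0 when the value lies inside the range."""
    lo, hi = STANDARD_RANGES[metric]
    if lo <= value <= hi:
        return 0.0
    bound = lo if value < lo else hi
    return abs(value - bound) / bound

def tactile_type(degree):
    """Hypothetical mapping from deviation degree to vibration tactile type."""
    return "soothing" if degree >= 0.1 else "neutral"

print(deviation_degree("heart_rate", 115))  # 0.15
print(tactile_type(deviation_degree("heart_rate", 115)))  # soothing
```

A real system would of course calibrate the standard values per metric and per user population; the point here is only that a scalar deviation degree can serve as the diagnostic information that drives tactile-type selection.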
  • In the above embodiments, the configuration in which the server device 300 includes the diagnostic information acquisition units 31 and 41, the vibration information acquisition units 32 and 42, and the vibration information providing unit 33 has been described. However, the user terminal 100 may include the diagnostic information acquisition units 31 and 41 and transmit the diagnostic information from the user terminal 100 to the server device 300.
  • Alternatively, the user terminal 100 may be provided with all of the functional components provided in the server device 300.
  • In addition, the vibration information providing unit 33 may present the vibration identification information to the user, and the user terminal 100 may access the server device 300 or another server device and download the substance of the vibration by making a request using the vibration identification information.
  • In the above embodiments, the configuration including the medical institution terminal 200 has been described; however, the first and second embodiments may be configured without the medical institution terminal 200.
  • Diagnosis information storage unit
  • 10A Face image storage unit
  • 10B Vibration information storage unit
  • 11 Face image photographing unit
  • 12 Face image recording unit
  • 13 Face image transmission unit
  • 14 Vibration information reception unit
  • 15 Vibration information recording unit
  • 16 Diagnosis information reception unit
  • 17 Diagnosis information recording unit
  • 18 Vibration imparting unit
  • 19 Face image display unit
  • 20A Face image storage unit
  • 21 Interview execution unit
  • 22 Answer information recording unit
  • 23 Answer information transmission unit
  • 26 Diagnosis information reception unit
  • 27 Diagnosis information recording unit
  • 29 Answer information display unit
  • 30A Face image storage unit
  • 30B Vibration information storage unit
  • 31 Diagnosis information acquisition unit
  • 31a Face image acquisition unit
  • 31b Image analysis unit
  • 32 Vibration information acquisition unit
  • 33 Vibration information provision unit
  • 34 Diagnosis information provision unit
  • 35 Diagnostic information provision unit
  • 40A Answer information storage unit
  • 40B Vibration information storage unit
  • 41 Diagnosis information acquisition unit
  • 42 Vibration information acquisition unit
  • 44 Diagnosis information provision unit
  • 45 Diagnosis information provision unit
  • 46 Answer information acquisition unit
  • 50A Diagnosis result information storage unit
  • 56 Diagnosis result information reception unit
  • 57 Diagnosis result information recording unit
  • 59 Diagnosis result information display unit

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

A vibration information providing system comprises: a diagnostic information acquisition unit (31) that acquires diagnostic information relating to the physical or mental condition of a user; a vibration information acquisition unit (32) that acquires, from among a plurality of pieces of vibration information relating to vibrations having different tactile qualities (the properties of a vibration that produce a specific tactile sensation), vibration information having a tactile quality corresponding to the diagnostic information acquired by the diagnostic information acquisition unit (31); and a vibration information providing unit (33) that provides the user with the vibration information acquired by the vibration information acquisition unit (32). Vibration information having a tactile quality corresponding to the diagnostic information on the user's physical or mental condition is thus acquired from the plurality of pieces of vibration information having different tactile qualities and provided to the user. This makes it possible, for example, to provide a user exhibiting certain symptoms with vibration information on vibrations producing a specific tactile sensation intended to alleviate or improve those symptoms, in accordance with diagnostic information on the physical or mental condition caused by those symptoms.
PCT/JP2021/009158 2021-03-09 2021-03-09 Système de fourniture d'informations de vibration, dispositif serveur de fourniture d'informations de vibration, dispositif d'acquisition d'informations de vibration, procédé de fourniture d'informations de vibration, programme pour la fourniture d'informations de vibration et programme pour l'acquisition d'informations de vibration WO2022190187A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023504902A JP7554515B2 (ja) 2021-03-09 2021-03-09 振動情報提供システム、振動情報提供サーバ装置、振動情報取得装置、振動情報提供方法、振動情報提供用プログラムおよび振動情報取得用プログラム
PCT/JP2021/009158 WO2022190187A1 (fr) 2021-03-09 2021-03-09 Système de fourniture d'informations de vibration, dispositif serveur de fourniture d'informations de vibration, dispositif d'acquisition d'informations de vibration, procédé de fourniture d'informations de vibration, programme pour la fourniture d'informations de vibration et programme pour l'acquisition d'informations de vibration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/009158 WO2022190187A1 (fr) 2021-03-09 2021-03-09 Système de fourniture d'informations de vibration, dispositif serveur de fourniture d'informations de vibration, dispositif d'acquisition d'informations de vibration, procédé de fourniture d'informations de vibration, programme pour la fourniture d'informations de vibration et programme pour l'acquisition d'informations de vibration

Publications (1)

Publication Number Publication Date
WO2022190187A1 true WO2022190187A1 (fr) 2022-09-15

Family

ID=83227539

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/009158 WO2022190187A1 (fr) 2021-03-09 2021-03-09 Système de fourniture d'informations de vibration, dispositif serveur de fourniture d'informations de vibration, dispositif d'acquisition d'informations de vibration, procédé de fourniture d'informations de vibration, programme pour la fourniture d'informations de vibration et programme pour l'acquisition d'informations de vibration

Country Status (2)

Country Link
JP (1) JP7554515B2 (fr)
WO (1) WO2022190187A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004357821A (ja) * 2003-06-03 2004-12-24 Tanita Corp ストレス診断システム
JP2010530281A (ja) * 2007-06-21 2010-09-09 イマージョン コーポレーション 触覚フィードバックを用いた健康状態の監視
JP2016523139A (ja) * 2013-06-06 2016-08-08 トライコード ホールディングス,エル.エル.シー. モジュール型生理学的モニタリング・システム、キット、および方法
JP2020120908A (ja) * 2019-01-30 2020-08-13 パナソニックIpマネジメント株式会社 精神状態推定システム、精神状態推定方法、及び、プログラム

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8004391B2 (en) 2008-11-19 2011-08-23 Immersion Corporation Method and apparatus for generating mood-based haptic feedback
US20160262690A1 (en) 2015-03-12 2016-09-15 Mediatek Inc. Method for managing sleep quality and apparatus utilizing the same
JP6742196B2 (ja) 2016-08-24 2020-08-19 Cyberdyne株式会社 生体活動検出装置および生体活動検出システム

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004357821A (ja) * 2003-06-03 2004-12-24 Tanita Corp ストレス診断システム
JP2010530281A (ja) * 2007-06-21 2010-09-09 イマージョン コーポレーション 触覚フィードバックを用いた健康状態の監視
JP2016523139A (ja) * 2013-06-06 2016-08-08 トライコード ホールディングス,エル.エル.シー. モジュール型生理学的モニタリング・システム、キット、および方法
JP2020120908A (ja) * 2019-01-30 2020-08-13 パナソニックIpマネジメント株式会社 精神状態推定システム、精神状態推定方法、及び、プログラム

Also Published As

Publication number Publication date
JP7554515B2 (ja) 2024-09-20
JPWO2022190187A1 (fr) 2022-09-15

Similar Documents

Publication Publication Date Title
Gürkök et al. Brain–computer interfaces for multimodal interaction: a survey and principles
US10475351B2 (en) Systems, computer medium and methods for management training systems
US20170352283A1 (en) Self-administered evaluation and training method to improve mental state
Kim A SWOT analysis of the field of virtual reality rehabilitation and therapy
JP2019513516A (ja) 人の視覚パフォーマンスを査定するために視覚データを入手し、集計し、解析する方法およびシステム
Tan et al. Methodology for maximizing information transmission of haptic devices: A survey
US9814423B2 (en) Method and system for monitoring pain of users immersed in virtual reality environment
US11779275B2 (en) Multi-sensory, assistive wearable technology, and method of providing sensory relief using same
Meusel Exploring mental effort and nausea via electrodermal activity within scenario-based tasks
Guillen-Sanz et al. A systematic review of wearable biosensor usage in immersive virtual reality experiences
Ramadan et al. Unraveling the potential of brain-computer interface technology in medical diagnostics and rehabilitation: A comprehensive literature review
Amini Gougeh et al. Towards instrumental quality assessment of multisensory immersive experiences using a biosensor-equipped head-mounted display
WO2022190187A1 (fr) Système de fourniture d'informations de vibration, dispositif serveur de fourniture d'informations de vibration, dispositif d'acquisition d'informations de vibration, procédé de fourniture d'informations de vibration, programme pour la fourniture d'informations de vibration et programme pour l'acquisition d'informations de vibration
Fortin et al. Laughter and tickles: Toward novel approaches for emotion and behavior elicitation
KR101691720B1 (ko) 터치스크린을 이용한 주의력결핍 과잉행동 진단 시스템
Elor Development and evaluation of intelligent immersive virtual reality games to assist physical rehabilitation
Akdağ et al. Measuring tactile sensitivity and mixed-reality-assisted exercise for carpal tunnel syndrome by ultrasound mid-air haptics
Grant Behavioural and neurophysiological measures of haptic feedback during a drilling simulation
Şahinol Collecting Data and the Status of the Research Subject in Brain-Machine Interface Research in Chronic Stroke Rehabilitation
JP2021083654A (ja) 注意機能訓練システム、注意機能訓練方法、訓練処理装置、及びコンピュータプログラム
Templeton Towards a Comprehensive Mobile-Based Neurocognitive Digital Health Assessment System (NDHAS)
Miri et al. Emotion Regulation in the Wild: The WEHAB Approach
Donekal Chandrashekar Extended Reality Simulator for Advanced Training Life Support System
Miri Using Technology to Regulate Affect: A Multidisciplinary Perspective
Mohamad et al. Study on elicitation and detection of emotional states with disabled users

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21929431

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023504902

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21929431

Country of ref document: EP

Kind code of ref document: A1