
WO2024219300A1 - Engagement estimation method, program, and engagement estimation system - Google Patents


Info

Publication number
WO2024219300A1
Authority
WO
WIPO (PCT)
Prior art keywords
worker
engagement
feature
information
period
Prior art date
Application number
PCT/JP2024/014490
Other languages
French (fr)
Japanese (ja)
Inventor
健一 入江
洋介 井澤
若正 清崎
拓磨 白井
Original Assignee
Panasonic IP Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd.
Publication of WO2024219300A1


  • The present disclosure relates generally to an engagement estimation method, a program, and an engagement estimation system, and more specifically to an engagement estimation method, a program, and an engagement estimation system that estimate a worker's engagement with their work.
  • Engagement is defined in terms of the following two points: (1) commitment to the organization, specifically, affective commitment (emotional attachment to the organization) and continuance commitment (desire to remain with the organization); and (2) extra-role behavior (any behavior that enables the organization to function effectively).
  • Increased engagement can, for example, lead to increased productivity, sales, customer satisfaction, and worker retention within an organization.
  • Traditionally, engagement has been quantified by conducting surveys of workers and analyzing their responses (e.g., Patent Documents 1 and 2).
  • The present disclosure aims to provide an engagement estimation method, a program, and an engagement estimation system that can reduce the burden placed on workers when estimating engagement.
  • An engagement estimation method is a method executed by an engagement estimation system to estimate a worker's engagement with work.
  • The engagement estimation method has a first step, a second step, a third step, a fourth step, and a fifth step.
  • In the first step, a questioning step is carried out at multiple time points.
  • In the questioning step, multiple images corresponding one-to-one to multiple moods are displayed, and the worker is asked to select, from the multiple images, an image that matches the worker's mood.
  • In the second step, answer information regarding which image was selected from the multiple images at each of the multiple time points is stored.
  • In the third step, multiple feature amounts are determined.
  • In the fourth step, a regression equation is obtained that represents the relationship between the multiple feature amounts in a first period and the worker's engagement in the first period, which has been determined in advance.
  • In the fifth step, the worker's engagement in a second period different from the first period is estimated based on the multiple feature amounts in the second period and the regression equation.
  • The multiple feature amounts include at least one of: a feature amount based on biometric information of the worker measured by a biometric information measurement terminal; a feature amount based on location information of the worker at the worker's workplace; a feature amount based on relationship information regarding a relationship between the worker and another worker in the workplace; and a feature amount based on a usage history of a computer system used by the worker for the work.
  • The multiple feature amounts further include a feature amount based on the answer information.
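  • As a sketch of the regression described above, the first period's feature amounts can be fit against the pre-determined engagement by least squares, and the resulting regression equation applied to the second period. The concrete numbers below (heart rate, step count, mood answer, engagement scores) are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

# Hypothetical feature matrix for the first period: one row per observation,
# columns = feature amounts (e.g. mean heart rate, steps, mood answer value).
X1 = np.array([
    [72.0, 8100, 1.5],
    [75.0, 6400, 2.0],
    [70.0, 9000, 1.0],
    [78.0, 5200, 3.0],
])
# Engagement in the first period, determined in advance (e.g. from a survey).
y1 = np.array([82.0, 74.0, 88.0, 61.0])

# Obtain a regression equation y = X @ w + b by least squares.
A = np.hstack([X1, np.ones((X1.shape[0], 1))])  # append intercept column
coef, *_ = np.linalg.lstsq(A, y1, rcond=None)

# Estimate engagement in the second period from its feature amounts.
X2 = np.array([[74.0, 7000, 2.0]])
y2_est = np.hstack([X2, np.ones((1, 1))]) @ coef
```

A multiple linear regression is only one possible form of "regression equation"; the disclosure does not commit to a particular model class here.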
  • A program according to one aspect of the present disclosure is a program for causing one or more processors of a computer system to execute the engagement estimation method.
  • An engagement estimation system estimates a worker's engagement with work.
  • The engagement estimation system includes an acquisition unit, a memory unit, a feature determination unit, a regression unit, and an estimation unit.
  • The acquisition unit acquires answer information from a worker terminal.
  • The worker terminal performs a questioning step at multiple time points. In the questioning step, multiple images corresponding one-to-one to multiple moods are displayed, and the worker is prompted to select, from the multiple images, an image that matches the worker's mood.
  • The answer information is information regarding which image was selected from the multiple images at each of the multiple time points.
  • The memory unit stores the answer information.
  • The feature determination unit determines multiple feature amounts.
  • The regression unit acquires a regression equation representing the relationship between the multiple feature amounts in a first period and the worker's engagement in the first period, which has been determined in advance.
  • The estimation unit estimates the worker's engagement in a second period different from the first period based on the multiple feature amounts in the second period and the regression equation.
  • The multiple feature amounts include at least one of: a feature amount based on biometric information of the worker measured by a biometric information measurement terminal; a feature amount based on location information of the worker at the worker's workplace; a feature amount based on relationship information regarding a relationship between the worker and another worker in the workplace; and a feature amount based on a usage history of a computer system used by the worker for the work.
  • The multiple feature amounts further include a feature amount based on the answer information.
  • FIG. 1 is a block diagram of an engagement estimation system and related components according to one embodiment.
  • FIG. 2 is a graph illustrating a process in the engagement estimation system.
  • FIG. 3 is an explanatory diagram showing an engagement estimation process performed by the engagement estimation system.
  • FIG. 4 is an explanatory diagram showing an operation window displayed on a PC in the engagement estimation method using the engagement estimation system.
  • FIG. 5 is a graph showing an example of response information acquired by the engagement estimation system.
  • FIG. 6 is a graph showing an example of the amount of change in response information acquired by the engagement estimation system.
  • FIG. 7 is a flowchart showing a process flow of a part of an engagement estimation method using the engagement estimation system.
  • FIG. 1 shows a schematic configuration of an engagement estimation system 1 of this embodiment.
  • The engagement estimation system 1 is used to estimate the engagement of workers.
  • Here, “worker” refers to anyone who works. Unlike “laborer” in the general sense, “worker” in the present disclosure refers not only to those who receive remuneration in exchange for labor but also to those who work without remuneration.
  • The engagement estimation system 1 is used, for example, in companies, government offices, or other organizations. In the following, as a representative example, a case where the engagement estimation system 1 is used in a company will be described.
  • Engagement estimation system 1 of this embodiment shown in FIG. 1 estimates a worker's engagement with work.
  • Engagement estimation system 1 includes a (first) acquisition unit 21, a memory unit 11, a feature determination unit 23, a regression unit 24, and an estimation unit 25.
  • The acquisition unit 21 acquires answer information from a worker terminal (PC 8).
  • The worker terminal performs a questioning step at multiple time points. In the questioning step, multiple images corresponding one-to-one to multiple moods are displayed (see FIG. 4), and the worker is prompted to select, from the multiple images, an image that matches the worker's mood.
  • The answer information is information regarding which image was selected from the multiple images at each of the multiple time points.
  • The memory unit 11 stores the answer information.
  • The feature determination unit 23 determines multiple feature amounts.
  • The regression unit 24 acquires a regression equation (see FIG. 2) that represents the relationship between the multiple feature amounts in a first period (see FIG. 3) and the worker's engagement in the first period, which has been determined in advance.
  • The estimation unit 25 estimates the worker's engagement in a second period (see FIG. 3) different from the first period based on the multiple feature amounts in the second period and the regression equation.
  • The multiple feature amounts include at least one of: a feature amount based on the worker's biometric information measured by the biometric information measurement terminal 3; a feature amount based on the worker's location information at the worker's workplace; a feature amount based on relationship information regarding the relationship between the worker and another worker in the workplace; and a feature amount based on the usage history of a computer system (PC 8) used by the worker for work.
  • The multiple feature amounts further include a feature amount based on the answer information.
  • With this configuration, the worker selects an image that fits the worker's mood from among multiple images, and the engagement estimation system 1 estimates the worker's engagement.
  • The worker can intuitively select an image that fits their current mood, which reduces the burden on the worker in answering questions. For example, in this embodiment, the burden on the worker is reduced compared to a case where a question is displayed, the worker reads the question, and the worker then composes an answer.
  • By using at least one of the worker's biometric information, location information, relationship information, and computer system usage history as a feature amount, it is possible to estimate engagement more objectively than when engagement is estimated based solely on responses to questions. Furthermore, using these feature amounts makes it possible to reduce the number of questions other than those asking about the worker's mood, or even to estimate engagement without asking any such questions at all. This reduces the burden on the people (workers, etc.) who answer the questions.
  • The engagement estimation method of this embodiment is executed by the engagement estimation system 1 and is a method for estimating a worker's engagement with work.
  • The engagement estimation method has a first step, a second step, a third step, a fourth step, and a fifth step.
  • In the first step, a questioning step is performed at multiple time points.
  • In the questioning step, multiple images corresponding one-to-one to multiple moods are displayed, and the worker is asked to select, from the multiple images, an image that matches the worker's mood.
  • In the second step, answer information regarding which image was selected from the multiple images at each of the multiple time points is stored.
  • In the third step, multiple feature amounts are determined.
  • In the fourth step, a regression equation is obtained that represents the relationship between the multiple feature amounts in the first period and the worker's engagement in the first period, which is determined in advance.
  • In the fifth step, the worker's engagement in a second period different from the first period is estimated based on the multiple feature amounts in the second period and the regression equation.
  • The multiple feature amounts include at least one of: a feature amount based on the worker's biometric information measured by the biometric information measurement terminal 3; a feature amount based on the worker's location information at the worker's workplace; a feature amount based on relationship information regarding the relationship between the worker and other workers at the workplace; and a feature amount based on the usage history of a computer system (PC 8) used by the worker for work.
  • The multiple feature amounts further include a feature amount based on the answer information.
  • The engagement estimation method can also be embodied in a program.
  • The program of this embodiment is a program for causing one or more processors of a computer system to execute the engagement estimation method.
  • The program may be recorded on a non-transitory recording medium that can be read by the computer system.
  • The engagement estimation system 1 estimates the engagement of each of multiple workers.
  • The engagement estimation system 1 is used together with, for example, a biological information measurement terminal 3, a position measurement system 4, a data server 5, an operation terminal 6, an information processing server 7, a PC (personal computer) 8, an attendance management system 9, and an exercise measurement terminal 10.
  • The biological information measurement terminal 3 measures biological information of each of the multiple workers.
  • The biological information includes, for example, at least one of heart rate, blood pressure, skin temperature, sweat rate, and voice information.
  • A single biological information measurement terminal 3 may measure multiple types of biological information (for example, heart rate and blood pressure).
  • The biological information measurement terminal 3 is, for example, a wearable terminal worn by a worker.
  • The wearable terminal is equipped with, for example, an optical heart rate sensor, which measures the worker's heart rate and blood pressure.
  • The wearable terminal is also equipped with, for example, a temperature sensor, which measures the worker's skin temperature.
  • The wearable terminal is also equipped with, for example, a sweat sensor, which measures the amount of sweat produced by the worker.
  • Alternatively, the biological information measurement terminal 3 captures an image of a worker for a certain period of time using a camera (such as a near-infrared camera) to generate image data, and measures the worker's heart rate based on the image data.
  • As another example, the biological information measurement terminal 3 is a blood pressure monitor with an arm band, and measures the worker's blood pressure while the arm band is wrapped around the worker's arm.
  • The biological information measurement terminal 3 may also be provided with a microphone, which converts the worker's voice into audio information in the form of an electrical signal.
  • The microphone may be provided in a wearable terminal.
  • The position measurement system 4 measures position information of each of the multiple workers.
  • The position information includes, for example, coordinate information of each of the multiple workers.
  • For example, each of the multiple workers carries a mobile terminal such as a smartphone or a wearable terminal.
  • Multiple beacon devices are installed in the workplaces of the multiple workers (e.g., office buildings, stores, or factories).
  • Each of the multiple beacon devices transmits a beacon signal.
  • The mobile terminal carried by the worker measures the received signal strength of the beacon signal.
  • Information on the received signal strength is transmitted from the mobile terminal to the position measurement system 4.
  • The position measurement system 4 calculates the distance between the mobile terminal and each of the multiple beacon devices based on the received signal strength.
  • The position measurement system 4 measures the position information of the mobile terminal by three-point positioning (trilateration) based on the distances between the mobile terminal and the multiple beacon devices and the position information of each beacon device.
  • The position measurement system 4 transmits the position information of the mobile terminal to the engagement estimation system 1 as the position information of the worker carrying the mobile terminal.
  • The mobile terminal (e.g., a wearable terminal) that receives the beacon signal may also serve as the biological information measurement terminal 3.
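  • The distance-and-positioning computation described above can be sketched as follows. The log-distance path-loss constants (`tx_power`, exponent `n`) and the beacon coordinates are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path-loss model: tx_power is the RSSI at 1 m,
    n is the environment-dependent path-loss exponent."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(beacons, distances):
    """Estimate (x, y) from >= 3 beacon positions and distances by
    subtracting circle equations pairwise and solving least squares."""
    (x1, y1), d1 = beacons[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(beacons[1:], distances[1:]):
        rows.append([2 * (xi - x1), 2 * (yi - y1)])
        rhs.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    sol, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return sol

# RSSI of -79 dBm corresponds to 10 m under the assumed model.
d = rssi_to_distance(-79.0)

# Three beacons at known positions; distances measured to a worker at (3, 4).
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([3.0, 4.0])
distances = [float(np.linalg.norm(true_pos - np.array(b))) for b in beacons]
pos = trilaterate(beacons, distances)
```

In practice RSSI is noisy, so a real position measurement system would smooth the signal strength and use more than three beacons where possible.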
  • The data server 5 stores relationship information.
  • The relationship information is information about the relationships between the multiple workers in their workplace (such as a company, a government office, or another organization). More specifically, the relationship information includes, for example, information about the hierarchical relationships between the multiple workers.
  • The information about the hierarchical relationships between the multiple workers is, for example, information about the job title of each of the multiple workers.
  • The job title refers to a work position, i.e., a position or rank.
  • The relationship information also includes, for example, organizational information about the organization (department, division, etc.) to which each of the multiple workers belongs. A department or division is distinguished by its name, for example, XX Department, XX Section, or XX Center.
  • The relationship information also includes, for example, business information (for example, the name of the business) identifying the business (project, etc.) in which each of the multiple workers is involved.
  • The data server 5 also stores area information.
  • The area information includes, for example, map information of the workplaces of the multiple workers.
  • The area information also includes, for example, information on the location and purpose of each room.
  • The operation terminal 6 is, for example, a personal computer or a mobile terminal.
  • The mobile terminal is, for example, a mobile phone such as a smartphone, a wearable terminal, or a tablet terminal.
  • The operation terminal 6 generates the worker's work-related declaration information in response to human operation.
  • The worker himself/herself may operate the operation terminal 6, or another person may operate it.
  • The operation terminal 6 includes, for example, a touch panel display and displays survey items on the touch panel display. A person answers the survey items by operating the touch panel display. The operation terminal 6 then generates declaration information that includes the answers obtained from the person.
  • For each survey item, the person selects an answer from among multiple options.
  • The multiple options are, for example, five options: “Agree,” “Somewhat agree,” “Can't say,” “Somewhat disagree,” and “Disagree.”
  • The survey items include, for example, questions used by the information processing server 7 to estimate engagement in the first period described above. Such questions are hereinafter referred to as “first questions,” and answers to the first questions are hereinafter referred to as “first answers.”
  • The first questions are, for example, questions asking about the worker's place of employment, the content of the work, and the worker's thoughts and feelings about their colleagues.
  • The survey items also include, for example, questions used by the engagement estimation system 1 to estimate engagement in a second period different from the first period. Such questions are hereinafter referred to as “second questions,” and answers to the second questions are hereinafter referred to as “second answers.” At least one second question may be the same as a first question. It is preferable that the number of second questions be less than the number of first questions.
  • The information processing server 7 estimates the worker's engagement in the first period. More specifically, the information processing server 7 first acquires declaration information from the operation terminal 6. The declaration information includes at least one first answer. Based on the at least one first answer, the information processing server 7 estimates the worker's engagement in the first period.
  • Engagement is expressed, for example, as a numerical value.
  • A known method, such as that disclosed in Patent Document 1, can be adopted as the method by which the information processing server 7 estimates engagement. For example, a score for each first answer is determined depending on which of the multiple options is selected as that answer. The information processing server 7 estimates the sum of the scores of the multiple first answers as the worker's engagement in the first period.
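  • A minimal sketch of this scoring scheme, assuming the five options shown earlier map to scores 0–4 (the actual score assignment is not specified in the disclosure):

```python
# Hypothetical scoring: each of the five options maps to a score, and
# engagement in the first period is the sum over all first answers.
OPTION_SCORES = {
    "Agree": 4,
    "Somewhat agree": 3,
    "Can't say": 2,
    "Somewhat disagree": 1,
    "Disagree": 0,
}

def estimate_engagement(first_answers):
    """Sum the per-answer scores to obtain an engagement value."""
    return sum(OPTION_SCORES[a] for a in first_answers)

score = estimate_engagement(["Agree", "Somewhat agree", "Can't say"])  # 4 + 3 + 2 = 9
```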
  • Each of the multiple workers uses a PC (personal computer) 8 for work. More specifically, each worker is assigned one or more PCs 8 by his/her workplace and uses the assigned PC(s) 8.
  • The PC 8 is equipped with a memory unit 84, a storage device that stores the usage history of the PC 8.
  • The memory unit 84 is, for example, a hard disk drive (HDD) or a solid state drive (SSD).
  • Software for acquiring the usage history of the PC 8 and storing it in the memory unit 84 may be installed on the PC 8.
  • The PC 8 also includes a processing unit 81, a display unit 82, and an operation unit 83.
  • The processing unit 81 includes a computer system having one or more processors and a memory.
  • The functions of the processing unit 81 are realized by the processor of the computer system executing a program recorded in the memory of the computer system.
  • The program may be pre-recorded in the memory, provided via a telecommunications line such as the Internet, or recorded on a non-transitory recording medium such as a memory card and provided.
  • The display unit 82 has a display for displaying information.
  • The operation unit 83 accepts operations by a person.
  • The operation unit 83 has, for example, at least one of a mouse, a keyboard, a button, and a touch panel.
  • The processing unit 81 controls the display unit 82, the operation unit 83, and the memory unit 84. At a predetermined time, the processing unit 81 causes the display unit 82 to display an operation window 820 (see FIG. 4) that presents a question prompting the worker to select, from a plurality of images, an image that fits the worker's mood. The worker can answer the question by operating the operation unit 83 to select one of the images. Answer information regarding which of the plurality of images was selected is stored in the memory unit 84.
  • The PC 8 is pre-installed with software for executing the questioning step, which displays the operation window 820 on the display unit 82 and prompts the worker to select one of the images.
  • The software may be provided to the PC 8 from the engagement estimation system 1 via wired or wireless communication, or may be provided to the PC 8 from another device.
  • The processing unit 81 performs the questioning step at multiple time points. More specifically, the processing unit 81 performs the questioning step at multiple time points in the first period and at multiple time points in the second period.
  • In this embodiment, the multiple time points are, for example, two time points per day, morning and evening. That is, the processing unit 81 performs the questioning step in the morning and evening of the first period and in the morning and evening of the second period.
  • Alternatively, the multiple time points may be the time when the worker arrives at work and the time immediately before the worker leaves work.
  • In other words, the questioning step is performed during multiple time periods in one day.
  • The operation window 820 is automatically displayed as a pop-up during specified time periods in the morning and evening.
  • The worker can cause the processing unit 81 to execute processing by moving the mouse cursor and clicking a specified area of the operation window 820.
  • Each of the multiple images displayed in the first step is an image showing a person's facial expression. That is, as shown in FIG. 4, multiple images showing people's facial expressions are displayed in the operation window 820. More specifically, each of the multiple images is an illustration. In FIG. 4, displayed from left to right are an image representing a very good mood (very good), an image representing a good mood (good), an image representing a bad mood (bad), and an image representing a very bad mood (very bad). In addition, the mood that each image represents is displayed as text near (below) each image.
  • The worker clicks on one of the images.
  • The processing unit 81 determines that the clicked image has been selected and generates answer information.
  • The answer information is represented by a number: 1, 2, 3, or 4. That is, when the image representing a very good mood is selected, the answer information is 1; when the image representing a good mood is selected, 2; when the image representing a bad mood is selected, 3; and when the image representing a very bad mood is selected, 4.
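  • The encoding of selections into answer information, along with the change amounts suggested by FIG. 6, can be sketched as follows; the record layout and worker ID are hypothetical illustrations:

```python
# Map each selectable mood image to its answer-information value (1-4).
MOOD_TO_ANSWER = {"very good": 1, "good": 2, "bad": 3, "very bad": 4}

def record_answer(store, worker_id, timestamp, selected_mood):
    """Append one (timestamp, answer-information) pair for a worker."""
    store.setdefault(worker_id, []).append((timestamp, MOOD_TO_ANSWER[selected_mood]))

answers = {}
record_answer(answers, "w001", "2024-04-10T09:00", "good")  # morning: answer 2
record_answer(answers, "w001", "2024-04-10T17:30", "bad")   # evening: answer 3

# Amount of change between consecutive answers (cf. FIG. 6):
values = [v for _, v in answers["w001"]]
changes = [b - a for a, b in zip(values, values[1:])]  # good -> bad gives +1
```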
  • The operation window 820 displays a minimize button 821, a maximize button 822, and a close button 823.
  • When the minimize button 821 is clicked, the operation window 820 is hidden; when the icon displayed on the task bar is then clicked, the operation window 820 is displayed again.
  • When the maximize button 822 is clicked, the operation window 820 is displayed over the entire screen of the display unit 82.
  • When the close button 823 is clicked, the software for executing the questioning step is terminated and the operation window 820 is hidden; when an operation is then performed to restart the software, the operation window 820 is displayed again.
  • If an image that matches the worker's mood is not selected, the processing unit 81 issues a notification urging the worker to select an image.
  • The notification is issued, for example, by displaying a message prompting the selection as a pop-up on the display unit 82.
  • That is, the engagement estimation method of this embodiment has a notification step of issuing a notification prompting the selection if an image that matches the worker's mood is not selected in the first step.
  • The processing unit 81 issues the notification again when a certain time has passed since the previous notification.
  • In other words, a notification prompting the selection is issued at certain time intervals until a selection is made.
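  • The repeat-notification behavior can be sketched as a schedule of reminder times at a fixed interval until the worker answers; the 10-minute interval below is an assumption for illustration only:

```python
def reminder_times(prompt_time, answer_time, interval):
    """Times (in minutes) at which a selection-prompt reminder is issued,
    repeating at a fixed interval until the worker makes a selection."""
    times, t = [], prompt_time + interval
    while t <= answer_time:
        times.append(t)
        t += interval
    return times

# Window pops up at minute 0; worker answers at minute 35; remind every 10 min.
times = reminder_times(0, 35, 10)  # reminders at minutes 10, 20, 30
```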
  • The PC 8 is an example of a computer system that a worker uses for work.
  • A computer system includes one or more computers.
  • The computer system that a worker uses for work is not limited to the PC 8 and may be, for example, a mobile phone such as a smartphone, a tablet terminal, or a host computer.
  • The computer system that a worker uses for work may also be, for example, a computer system for operating an object such as a vehicle or a machine tool, in which case the usage history may include the operation history of that object.
  • The PC 8 is also an example of a worker terminal that performs the above-mentioned questioning step at multiple points in time.
  • The worker terminal is not limited to the PC 8 and may be, for example, a mobile phone such as a smartphone or a tablet terminal.
  • As described above, a survey (questions) is presented on the operation terminal 6, and the worker or another person answers it.
  • Declaration information is thereby generated on the operation terminal 6.
  • Meanwhile, a question is presented on the PC 8 prompting the worker to select, from among multiple images, an image that matches the worker's mood, and the worker himself/herself answers the question.
  • Answer information representing the worker's mood is thereby generated on the PC 8. That is, in this embodiment, questions are presented on both the operation terminal 6 and the PC 8.
  • The PC 8 may also function as the operation terminal 6. That is, the questions presented on the operation terminal 6 in this embodiment may instead be presented by the PC 8. Alternatively, the questions presented on the PC 8 in this embodiment may be presented by the operation terminal 6.
  • the attendance management system 9 generates attendance information for each of a plurality of workers.
  • For example, each of the plurality of workers carries a readable device such as a mobile terminal (e.g., a smartphone or wearable terminal) or an IC card, and holds the readable device over the reader of the attendance management system 9 when arriving at and leaving work.
  • The attendance management system 9 then reads the identification information stored in the readable device.
  • In this way, the attendance management system 9 generates attendance information for each of the plurality of workers; the attendance information includes the arrival time and the leaving time.
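  • One way to derive arrival and leaving times from card reads, keeping the earliest and latest read per worker per day, is sketched below; the record shape and worker ID are hypothetical:

```python
def record_read(attendance, worker_id, timestamp):
    """Update a worker's attendance record for the day of this card read:
    the earliest read is treated as arrival, the latest as leaving."""
    day = timestamp[:10]  # ISO 8601 timestamps compare lexicographically
    rec = attendance.setdefault((worker_id, day),
                                {"arrival": timestamp, "leaving": timestamp})
    rec["arrival"] = min(rec["arrival"], timestamp)
    rec["leaving"] = max(rec["leaving"], timestamp)

attendance = {}
record_read(attendance, "w001", "2024-04-10T08:58")
record_read(attendance, "w001", "2024-04-10T18:12")
```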
  • The exercise measurement terminal 10 obtains an exercise index of the worker.
  • The exercise index indicates at least one of the quality and quantity of exercise.
  • The exercise index includes, for example, at least one of the amount of activity and the amount of movement.
  • The amount of activity is expressed, for example, in METs (metabolic equivalents).
  • The amount of movement is, for example, the number of steps.
  • The exercise measurement terminal 10 is, for example, a wearable terminal.
  • The worker carries the wearable terminal.
  • The wearable terminal is equipped with, for example, a pedometer and measures the number of steps taken by the worker.
  • The wearable terminal also measures, for example, the worker's biometric information (heart rate, blood pressure, skin temperature, amount of sweat, etc.) as described above.
  • The wearable terminal then calculates the worker's activity level based on the biometric information.
  • The engagement estimation system 1 includes a processing unit 2, a memory unit 11, and a communication unit 12.
  • The memory unit 11 is a storage device configured with, for example, a hard disk drive (HDD) or a solid state drive (SSD).
  • The memory unit 11 stores information.
  • For example, the memory unit 11 stores information acquired from external devices, such as biometric information acquired from the biological information measurement terminal 3, location information acquired from the position measurement system 4, relationship information acquired from the data server 5, and usage history and answer information acquired from the PC 8.
  • The communication unit 12 includes a communication interface device.
  • The communication unit 12 is capable of communicating with external devices (e.g., the biological information measurement terminal 3, the position measurement system 4, the data server 5, and the PC 8) via the communication interface device.
  • Here, “capable of communication” means that signals can be sent and received directly, or indirectly via a network or a repeater, using an appropriate communication method such as wired or wireless communication.
  • The processing unit 2 includes a computer system having one or more processors and a memory.
  • The functions of the processing unit 2 are realized by the processor of the computer system executing a program recorded in the memory of the computer system.
  • The program may be pre-recorded in the memory, provided via a telecommunications line such as the Internet, or recorded on a non-transitory recording medium such as a memory card and provided.
  • The processing unit 2 has a first acquisition unit 21, a second acquisition unit 22, a feature determination unit 23, a regression unit 24, an estimation unit 25, a presentation content generation unit 26, and a communication processing unit 27. Note that these merely represent functions realized by the processing unit 2 and do not necessarily indicate a concrete configuration.
  • the first acquisition unit 21 acquires worker information (information on each of a plurality of workers) via the communication unit 12.
  • the worker information includes the worker's biometric information measured by the biometric information measurement terminal 3, the worker's position information measured by the position measurement system 4, and the relationship information stored in the data server 5.
  • the worker information further includes reporting information generated by the operation terminal 6.
  • the reporting information is information about the work and is generated in response to operations on the operation terminal 6.
  • the reporting information is, for example, one or both of the first and second responses described above.
  • the worker information also includes the usage history of the computer system that the worker uses for work.
  • the computer system is the PC 8. That is, the worker information includes the usage history of the PC 8.
  • the worker information also includes answer information that indicates the worker's mood.
  • the answer information is generated by the PC 8 as the worker operates the PC 8.
  • the worker information further includes the worker's attendance information.
  • the worker's attendance information is generated by the attendance management system 9.
  • the worker information further includes the worker's exercise index.
  • the worker's exercise index is generated by the exercise measurement terminal 10.
  • the second acquisition unit 22 acquires the engagement of a worker in a first period estimated by the information processing server 7.
  • the feature determining unit 23 determines a plurality of features from the worker information based on the relationship between the worker information and the engagement in the first period. For example, the feature determining unit 23 extracts a plurality of parameters from the worker information.
  • the plurality of parameters are, for example, the duration of a conversation between the worker and a specific person, the amount of activity of the worker, and the overtime hours of the worker.
  • a certain parameter may coincide with certain information about a worker.
  • the information about the worker may be used as a parameter directly.
  • the overtime hours of a worker is an example of information about a worker.
  • the overtime hours of a worker may be used as a parameter.
  • a certain parameter may be found based on certain information about the worker. That is, the information about the worker may be processed to become the parameter.
  • the position information of the worker measured by the position measurement system 4 and the relationship information stored in the data server 5 are each an example of worker information.
  • the conversation time between the worker and a specific person may be found as a parameter. More specifically, the conversation time (face-to-face) between the worker and the supervisor can be found, for example, from the position information of the worker and the supervisor. That is, the time when the distance between the worker and the supervisor is within a specified distance (for example, one meter) can be determined as the conversation time between the worker and the supervisor. Also, whether or not a certain worker is a supervisor can be identified based on the relationship information.
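As a rough sketch, the determination of conversation time from position information described above might look like the following (the sampling interval, coordinates, and variable names are illustrative assumptions, not part of the embodiment):

```python
import math

# Hypothetical position samples (x, y in metres), one per minute, for a
# worker and a supervisor; in the system these would come from the
# position measurement system 4. Names and values are illustrative.
SAMPLE_INTERVAL_MIN = 1
THRESHOLD_M = 1.0  # "within a specified distance (for example, one meter)"

worker_positions = [(0.0, 0.0), (0.5, 0.0), (3.0, 4.0), (0.2, 0.1)]
supervisor_positions = [(0.4, 0.3), (0.6, 0.5), (9.0, 9.0), (0.5, 0.4)]

def conversation_minutes(a, b):
    """Total minutes during which the two people are within THRESHOLD_M."""
    minutes = 0
    for (ax, ay), (bx, by) in zip(a, b):
        if math.hypot(ax - bx, ay - by) <= THRESHOLD_M:
            minutes += SAMPLE_INTERVAL_MIN
    return minutes

talk_time = conversation_minutes(worker_positions, supervisor_positions)
```

Whether the specific person is a supervisor would, per the text, be identified separately from the relationship information.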
  • the feature determination unit 23 may determine multiple feature amounts for each worker. In other words, to determine multiple feature amounts corresponding to a certain worker, information about that worker and that worker's engagement during the first period may be referenced.
  • the feature determination unit 23 may determine multiple features common to two or more workers. In other words, to determine multiple features corresponding to two or more workers, information on each of the two or more workers and the engagement of each of the two or more workers in the first period may be referenced.
  • the regression unit 24 acquires a regression equation that expresses the relationship between the multiple feature amounts in the first period and the worker's engagement in the first period, the engagement having been determined in advance based on the first question described above.
  • the process of "acquiring a regression equation" may be a process in which the regression unit 24 acquires, from a configuration external to the engagement estimation system 1, a regression equation determined in that external configuration, or a process in which the regression unit 24 acquires a regression equation stored in the storage unit 11 of the engagement estimation system 1.
  • the process of "acquiring a regression equation” may be a process of determining a regression equation.
  • the regression unit 24 performs the process of "acquiring a regression equation” by performing a process of determining a regression equation based on the multiple feature amounts in the first period and the worker engagement in the first period that was determined in advance.
  • the process in which the feature determination unit 23 determines multiple feature quantities and the process in which the regression unit 24 determines a regression equation are performed in an integrated manner.
  • Each of the multiple feature quantities is a factor that determines engagement.
  • the multiple parameters are obtained based on worker information, etc.
  • a given parameter may have a relatively strong correlation with engagement, a relatively weak correlation, or no correlation at all.
  • the feature determination unit 23 determines, among the multiple parameters, a parameter that has a strong correlation with the engagement in the first period as a feature.
  • the feature determination unit 23 calculates the strength of the correlation, for example, by multiple regression analysis. That is, the feature determination unit 23 (and the regression unit 24) obtains a multiple regression equation using the engagement in the first period as the objective variable and the multiple parameters in the first period as multiple explanatory variables, and further obtains a coefficient of determination of the multiple regression equation.
  • the coefficient of determination is a value between 0 and 1. The larger the coefficient of determination, the stronger the correlation.
  • the feature determination unit 23 determines the multiple feature quantities based on the coefficient of determination.
  • the feature determination unit 23 determines the multiple explanatory variables when the coefficient of determination is greater than a threshold value as the multiple feature quantities.
  • the feature determination unit 23 may obtain a simple regression equation instead of the multiple regression equation, and obtain the strength of correlation (coefficient of determination) from the simple regression equation.
  • the feature determination unit 23 may obtain the coefficient of determination using another machine learning model.
  • the feature determination unit 23 can determine the correlation between multiple features and engagement by regression analysis or the like.
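The screening of parameters by the coefficient of determination can be sketched, for example, with a simple (single-variable) regression as the text permits (all data values, parameter names, and the threshold below are illustrative assumptions):

```python
# Hypothetical monthly data: engagement (objective variable) and two
# candidate parameters (explanatory variables). Values are illustrative.
engagement = [3.1, 3.4, 2.9, 3.8, 3.3, 3.6]
candidates = {
    "talk_time_with_boss": [10.0, 14.0, 8.0, 18.0, 12.0, 16.0],
    "room_temperature":    [26.0, 25.5, 27.0, 26.5, 26.0, 25.0],  # weak example
}

def r_squared(y, x):
    """Coefficient of determination of the least-squares line y = a + b*x."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

R2_THRESHOLD = 0.5  # illustrative threshold
# Keep as features only the parameters whose R^2 exceeds the threshold.
features = [name for name, x in candidates.items()
            if r_squared(engagement, x) > R2_THRESHOLD]
```

With the multiple regression described in the text, the same idea applies, except that all candidate parameters enter one equation as explanatory variables.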
  • Figure 2 illustrates an example of a simple regression equation (straight line L1) obtained with engagement in the first period as the objective variable and one parameter in the first period as the explanatory variable.
  • the explanatory variable is the number of conversations with a second-level supervisor (the direct supervisor's own direct supervisor).
  • the objective variable and the explanatory variable are each obtained for multiple time periods.
  • pairs of the objective variable and the explanatory variable for each time period are plotted as points.
  • a simple regression equation is obtained based on these multiple points.
  • When determining multiple features corresponding to one specific worker, Figure 2 is a plot of data (pairs of the objective variable and the explanatory variable) for that one worker. When determining multiple features common to two or more workers, Figure 2 is a plot of data (pairs of the objective variable and the explanatory variable) for each of the two or more workers.
  • the feature determination unit 23 also determines, as a feature, at least one of the following four parameters: a parameter based on the worker's biometric information measured by the biometric information measurement terminal 3, a parameter based on the worker's location information at the worker's workplace, a parameter based on relationship information regarding the relationship between the worker and other workers at the workplace, and a parameter based on the usage history of the computer system (the PC 8) used by the worker for work.
  • the feature determination unit 23 may determine multiple feature amounts based on the difference between the engagement at the first time point and the engagement at the second time point. In other words, the feature determination unit 23 may determine multiple feature amounts based on the amount of change in engagement. For example, the feature determination unit 23 may determine a multiple regression equation using multiple parameters as multiple explanatory variables and the amount of change in engagement as the objective variable, and may determine, from among the multiple parameters, a parameter that has a strong correlation with the amount of change in engagement as the feature amount.
  • the feature determination unit 23 may set the explanatory variable as the difference between a predetermined parameter and a reference value.
  • the reference value may be, for example, the average value of the predetermined parameter in a specific period.
  • the predetermined parameter is set to the activity level of the worker in a first period.
  • the reference value may be, for example, the average value of the activity level of the worker in the same period of the previous year to the first period.
  • the reference value may be, for example, the average value of the activity level of the worker from a predetermined number of days (for example, six months) before the first period to the first period.
  • the multiple feature amounts include at least a feature amount based on answer information that represents the mood of the worker.
  • the answer information is information regarding which of the multiple images was selected at each of the multiple time points (morning and evening). As mentioned above, the answer information is expressed as a number: 1, 2, 3, or 4.
  • Figure 5 shows an example of response information for the morning and evening of each day.
  • an image representing a good mood (good) was selected, so the response information is 2.
  • an image representing a bad mood (bad) was selected, so the response information is 3.
  • the third step includes a step of determining the amount of change in the worker's mood between multiple time points. More specifically, in the third step, the feature quantity determination unit 23 determines the amount of change in mood from morning to evening for each day. The amount of change is the value obtained by subtracting the value of the answer information in the morning of the same day from the value of the answer information in the evening. If the worker's mood does not change from morning to evening, the amount of change is 0. If the worker's mood improves from morning to evening, the amount of change is a positive value. If the worker's mood deteriorates from morning to evening, the amount of change is a negative value.
  • the feature based on the response information includes the variability in the amount of change in the worker's mood.
  • the variability is the standard deviation σ calculated from the amount of change in the worker's mood over a certain period (here, n days).
  • the standard deviation σ is calculated using the standard formula σ = √((1/n) Σ_{i=1}^{n} (x_i − x̄)²), where x_i is the amount of change on day i and x̄ is the mean of the amounts of change over the n days.
  • the standard deviation σ (variability) calculated in this manner is a feature based on the response information.
  • the variability used as a feature may be the variance instead of the standard deviation σ.
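The variability feature can be sketched as follows (the answer values are illustrative; the population standard deviation is assumed here, though the embodiment may equally use a different definition or the variance):

```python
import statistics

# Hypothetical answer information over n = 5 days: a number 1-4 recorded
# in the morning and in the evening, mirroring the four selectable mood
# images described in the text. The values are illustrative.
morning = [2, 2, 3, 1, 2]
evening = [3, 2, 1, 4, 2]

# Amount of change per day, as in the text: the evening value minus the
# morning value of the same day (0 means the mood did not change).
changes = [e - m for m, e in zip(morning, evening)]

# Population standard deviation of the daily changes is the "variability"
# feature based on the response information.
sigma = statistics.pstdev(changes)
```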
  • the correlation between variability and engagement is expressed by a regression equation as described above.
  • a regression equation is calculated for each individual worker.
  • for a first worker, the greater the variability, the lower the engagement.
  • for a second worker, the rate of change in engagement relative to a change in variability may differ from that of the first worker, or the greater the variability, the higher the engagement.
  • the estimation unit 25 executes the fifth step (estimation step). That is, the estimation unit 25 estimates the engagement of the worker in a second period (the most recent period, as an example here) different from the first period, based on the multiple feature amounts determined by the feature amount determination unit 23 and the regression equation obtained by the regression unit 24. For example, it is assumed that three parameters, namely, the conversation time between the worker and the second-level supervisor, the overtime hours of the worker, and the variance in the amount of change in the worker's mood, are determined as feature amounts by the feature amount determination unit 23.
  • the estimation unit 25 obtains the most recent engagement reflecting the parameters of the previous month by substituting the above three parameters in the previous month into a multiple regression equation (obtained by the regression unit 24) in which each of the above three parameters is an explanatory variable and the engagement is an objective variable.
  • the engagement may be calculated, for example, by substituting the average value of the parameters at the multiple time points into the multiple regression equation.
  • multiple engagements may be calculated by substituting the parameters at the multiple time points into the multiple regression equation.
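The estimation by substituting second-period parameter values into the regression equation can be sketched as follows (the coefficients and parameter values are illustrative assumptions, not values actually obtained by the regression unit 24):

```python
# Hypothetical fitted multiple regression equation:
#   engagement = b0 + b1*talk + b2*overtime + b3*mood_var
# standing in for the equation obtained by the regression unit 24.
b0, b1, b2, b3 = 2.0, 0.05, -0.01, -0.5

# Most recent (second-period) feature values for one worker (illustrative).
talk_time_min = 20.0     # conversation time with the second-level supervisor
overtime_h = 10.0        # overtime hours
mood_change_var = 0.4    # variability of the mood-change amount

# Substituting the second-period features into the regression equation
# yields the estimated engagement for the second period.
estimated_engagement = (b0 + b1 * talk_time_min
                        + b2 * overtime_h + b3 * mood_change_var)
```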
  • the presentation content generation unit 26 executes a presentation content generation step. That is, the presentation content generation unit 26 generates content to be presented to the viewer based on the estimation result in the fifth step (estimation step) by the estimation unit 25.
  • the content to be presented to the viewer includes, for example, a numerical value representing the engagement obtained in the fifth step.
  • the viewer may be the worker himself/herself who is the target of the engagement estimation, or may be another person (for example, the worker's supervisor).
  • the presentation content generation unit 26 generates, for example, a list of the engagement of each of a number of workers for each survey period (e.g., each month) as content to be presented to the viewer.
  • the presentation content generation unit 26 also generates, for example, a list of a worker's engagement for each survey period as content to be presented to the viewer.
  • the communication processing unit 27 controls the transmission and reception of information by the communication unit 12.
  • the communication processing unit 27 executes the transmission step by controlling the communication unit 12.
  • the transmission step is a step of transmitting the content generated in the presentation content generation step to a terminal.
  • the terminal is, for example, a PC 8.
  • the terminal has a display that displays the received content. The viewer views the content generated in the presentation content generation step via the display of the terminal.
  • the content generated in the presentation content generation step is transmitted to the terminal at regular intervals.
  • the regular intervals are, for example, one week, two weeks, one month, or two months.
  • Worker information, including biometric information, location information, and relationship information, is collected periodically or irregularly through measurements and questionnaires given to workers.
  • the worker information is then compiled on a monthly basis. For example, in Figure 3, worker information for January, worker information for February, etc. are compiled.
  • the information processing server 7 also estimates engagement based on the above-mentioned first response included in the declaration information.
  • a survey is conducted on workers once a month, and a first response is obtained as a response to the survey. Then, the information processing server 7 estimates engagement for each month based on the first response. For example, in FIG. 3, the information processing server 7 estimates engagement for January based on the first response in January, estimates engagement for February based on the first response in February, and estimates engagement for March based on the first response in March.
  • in the example of Figure 3, the engagement estimation system 1 starts estimating engagement from April.
  • the engagement data estimated by the information processing server 7 is referred to as "reference engagement data.”
  • the first period described above corresponds to January to March before the engagement estimation system 1 starts estimating engagement.
  • the engagement estimation system 1 first determines a multiple regression equation and feature quantities. For example, when estimating engagement for April (second period), the engagement estimation system 1 refers to worker information and reference engagement data for a time period (first period) other than April (second period). In FIG. 3, the engagement estimation system 1 refers to worker information and reference engagement data for January to March. In this way, the engagement estimation system 1 determines a multiple regression equation and feature quantities. In more detail, the engagement estimation system 1 determines a multiple regression equation using engagement for January to March (first period) as the objective variable and multiple parameters extracted from worker information for January to March (first period) as multiple explanatory variables, and determines the multiple explanatory variables when the coefficient of determination of the multiple regression equation is greater than a threshold value as multiple feature quantities.
  • the engagement estimation system 1 estimates engagement for April (second period) from the multiple regression equation and the feature quantities. More specifically, the engagement estimation system 1 finds engagement for April (second period) by substituting the feature quantities obtained from the worker information for April (second period) into the multiple regression equation.
  • the engagement estimation system 1 generates content to be presented to the viewer based on the determined engagement, and transmits the content to the terminal (PC 8). For example, the engagement estimation system 1 generates content to be presented once a month, and transmits the content to the terminal.
  • the process of determining the multiple regression equation and feature quantities does not need to be executed every time the engagement estimation system 1 estimates engagement, but only needs to be executed the first time (i.e., when estimating engagement in April).
  • the process of determining (updating) the multiple regression equation and feature quantities may be executed every time a period longer than the engagement estimation interval (one month) by the engagement estimation system 1 elapses (e.g., every six months).
  • the burden is reduced, for example, because the frequency of conducting surveys to obtain a first response can be reduced.
  • In order for the engagement estimation system 1 to estimate engagement for a second period (e.g., April), it is not essential for the engagement estimation system 1 to collect, from the worker information for the second period, information other than the multiple feature quantities determined by the feature determination unit 23. For example, assume that the conversation time between the worker and the second-level supervisor, the worker's overtime hours, and the variability in the amount of change in the worker's mood are determined as the multiple feature quantities by the feature determination unit 23. In this case, it is not essential for the engagement estimation system 1 to collect worker information for the second period other than these feature quantities (e.g., the worker's activity level).
  • the engagement estimation system 1 may determine a multiple regression equation and multiple feature quantities by referring to worker information and reference engagement data for a period later than April (second period) (e.g., May to June).
  • the first step is a step of executing a questioning step at multiple points in time (morning and evening of each day) in which an operation window 820 is displayed on the display unit 82 of the PC 8 and the worker is prompted to select one of the images.
  • the PC 8 determines whether the current time is a specific time in the morning or evening (step ST1).
  • the PC 8 displays a question on the display unit 82 (step ST2). More specifically, the PC 8 displays an operation window 820 on the display unit 82, which presents a question for prompting the worker to select an image that matches the worker's mood from among multiple images.
  • the worker answers the question by selecting one of the images.
  • the processing unit 81 stores the answer time and the answer content in the memory unit 84 (step ST4).
  • the answer content is, in other words, a numerical value of 1, 2, 3, or 4 as answer information.
  • the processing unit 81 then transmits the answer time and the answer content to the engagement estimation system 1 (step ST5).
  • If no reply is made within a certain period of time after the operation window 820 is displayed on the display unit 82 (step ST3: No), the processing unit 81 issues a notification prompting a reply (step ST6). Until a reply is made, the processing unit 81 repeats the notification at regular intervals.
  • FIG. 7 merely shows one example of the flow of processing performed by PC 8, and the order of processing may be changed as appropriate, and processing may be added or omitted as appropriate.
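The flow of steps ST1 to ST6 can be sketched as follows (display, storage, and transmission are stubbed out; the question times and the reminder interval are illustrative assumptions):

```python
# Minimal sketch of the questioning flow of FIG. 7 (steps ST1-ST6).
ASK_TIMES = {"09:00", "17:00"}   # morning / evening question times (ST1)
REMINDER_EVERY_MIN = 30          # remind at regular intervals until answered

log = []  # stands in for display / storage / transmission side effects

def on_tick(now, answer=None, minutes_since_prompt=None):
    """Process one time step of the flow; returns the action taken."""
    if minutes_since_prompt is None:
        if now in ASK_TIMES:
            log.append(("show_question", now))           # ST2: display window
            return "show_question"
        return "idle"
    if answer is not None:                               # ST3: Yes
        log.append(("store_and_send", now, answer))      # ST4 and ST5
        return "store_and_send"
    if minutes_since_prompt % REMINDER_EVERY_MIN == 0:   # ST3: No -> ST6
        log.append(("remind", now))
        return "remind"
    return "wait"

on_tick("09:00")                                 # question displayed
on_tick("09:30", minutes_since_prompt=30)        # no answer yet -> reminder
result = on_tick("09:35", answer=2, minutes_since_prompt=35)  # answered
```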
  • A feature may be, for example, related to the "job resources," "personal resources," or "job demands" defined in the "job demands-resources model," i.e., the "JD-R model."
  • the feature may be related to empathy or a sense of satisfaction with at least one of the vision, mission, and philosophy of the group (company, etc.) to which the worker belongs for work.
  • the feature may be related to two or more of the above.
  • job resources refer to at least one of "support from those around you,” “relationships with those around you,” “job autonomy,” “coaching from colleagues,” “feedback from colleagues,” “diversity of relationships,” and “opportunities for career development.”
  • personal resources may relate to at least one of "optimism,” “resilience,” and “recovery status.”
  • job demands refers to at least one of the following: “quantitative workload,” “qualitative workload,” and “physical workload at work.”
  • the feature determination unit 23 extracts multiple parameters from, for example, the worker's information and determines multiple feature amounts from them.
  • the parameters (feature amounts) are, for example, the conversation time between the worker and a specific person, the worker's activity level, and the worker's overtime hours.
  • the multiple feature amounts determined by the feature determination unit 23 include at least one of the following: a feature amount based on the worker's biometric information measured by the biometric information measurement terminal 3, a feature amount based on the worker's location information at the worker's workplace, a feature amount based on relationship information regarding the relationship between the worker and another worker at the workplace, and a feature amount based on the usage history of the computer system (PC 8) used by the worker for work.
  • the multiple feature amounts determined by the feature determination unit 23 further include a feature amount based on response information.
  • the conversation time between a worker and a specific person, as a feature, relates to "support from those around you" and "relationships with those around you" among the "job resources."
  • the (face-to-face) conversation time between a worker and a specific person can be obtained, for example, from the position information and voice information of each of a plurality of workers.
  • the conversation time between a worker and a specific person can be determined as the time when the distance between the worker and the specific person is within a predetermined distance (for example, 1 meter) and voice information is output from a microphone present around the conversation location.
  • the microphone may be, for example, carried by the worker or installed near the worker's work location.
  • the duration of a (face-to-face) conversation between a worker and a specific person can also be calculated, for example, from only the location information of each of a number of workers.
  • the duration of a conversation can be calculated by treating, for convenience, any time when the worker and a specific person are in close proximity to each other as time spent in conversation.
  • the duration of a conversation between a worker and a specific person can be determined as the time when the distance between them is within a specified distance.
  • conversation time between the worker and a specific person is not limited to face-to-face conversation time, but may also include non-face-to-face (e.g., online) conversation time.
  • Online conversation time is extracted, for example, from the usage history of the PC 8.
  • the face-to-face conversation time and the non-face-to-face conversation time between a worker and a specific person may be obtained separately.
  • the conversation time between the worker and the specific person may be extracted from the declaration information input to the operation terminal 6.
  • the engagement estimation system 1 may obtain the conversation time between the worker and the specific person based on the declaration from the respondent (e.g., the worker).
  • the relationship information is information about the relationships between multiple workers in the workplace of the multiple workers.
  • the feature determination unit 23 may obtain the conversation time of multiple workers for each relationship based on the relationship information. That is, the feature determination unit 23 may obtain the conversation time between a worker and a worker in a specific position. For example, the feature determination unit 23 may obtain the conversation time between a worker and a superior, the conversation time between a worker and a subordinate, and the conversation time between workers of the same position.
  • the relationships may also be further subdivided. For example, the feature determination unit 23 may obtain the conversation time between a worker and a first-level superior (immediate superior) and the conversation time between a worker and a second-level superior (the superior of the immediate superior).
  • the feature determination unit 23 may obtain the conversation time between workers who belong to the same organization (department, etc.) and between workers who belong to different organizations (departments, etc.). Also, for example, the feature determination unit 23 may determine the conversation time between workers who are in charge of the same work (project, etc.).
  • the number of conversations between a worker and a specific person may be obtained as a feature.
  • the number of face-to-face conversations can be obtained, for example, from the position information of each of a plurality of workers.
  • the number of times the distance between the worker and a specific person changes from more than a specified distance (for example, one meter) to within the specified distance can be regarded as the number of conversations between the worker and the specific person.
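Counting threshold crossings as conversations can be sketched as follows (the distance samples and names are illustrative):

```python
THRESHOLD_M = 1.0  # "a specified distance (for example, one meter)"

# Hypothetical per-sample distances (metres) between the worker and a
# specific person, e.g. derived from the position measurement system 4.
distances = [5.0, 0.8, 0.6, 4.0, 3.0, 0.9, 0.7, 0.5, 6.0]

def conversation_count(d, threshold=THRESHOLD_M):
    """Count transitions from farther than the threshold to within it."""
    count = 0
    was_near = False
    for dist in d:
        near = dist <= threshold
        if near and not was_near:
            count += 1  # crossed from outside the threshold to within it
        was_near = near
    return count

n_conversations = conversation_count(distances)
```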
  • the feature quantities of the worker's overtime hours, number of overtime occurrences, number of night shifts, and number of holiday shifts relate to the "job autonomy" of the "job resources."
  • the worker's overtime hours, number of overtime occurrences, number of night shifts, and number of holiday shifts are extracted, for example, from the attendance information output from the attendance management system 9 or the reported information entered into the operation terminal 6.
  • the worker's overtime hours, number of overtime occurrences, number of night shifts, and number of holiday shifts are determined, for example, from the start and end times of the PC 8, which are extracted from the usage history of the PC 8.
  • the mood at the end of the workday is related to the "work resources" of "coaching from colleagues” and “feedback from colleagues.”
  • the mood at the end of the workday is extracted, for example, from the reporting information entered into the operation terminal 6.
  • the number of communications within a department and the number of people with whom communication takes place within the department, as features, relate to the "diversity of relationships" of the "job resources."
  • the number of face-to-face communications within a department and the number of people with whom communication takes place within the department can be determined, for example, from the location information of each of multiple workers, or from location information and voice information, or can be extracted from reporting information entered into the operation terminal 6.
  • the number of non-face-to-face (e.g., online) communications within a department and the number of people with whom communication takes place within the department can be extracted, for example, from the usage history of PC 8 or from reporting information entered into the operation terminal 6.
  • the number of spaces used and the number of times specialized tools were used, as features, relate to the "opportunities for career development" of the "job resources."
  • the number of spaces used and the number of times specialized tools were used can be obtained, for example, from the location information of the worker, or extracted from the usage history of the PC 8 or the reporting information entered into the operation terminal 6.
  • the feature quantities of break time, time when no data was entered into the PC, and the number of times when no data was entered into the PC relate to the "resilience" of the "personal resources."
  • the break time, time when no data was entered into the PC, and the number of times when no data was entered into the PC are extracted, for example, from the usage history of the PC 8 or the reported information entered into the operation terminal 6.
  • the amount of movement within the office is related to the "resilience" of "personal resources.”
  • the amount of movement within the office can be calculated, for example, from the location information of the worker, or measured by a pedometer (movement measuring terminal 10) carried by the worker.
  • the PC operation time and the number of times the PC is operated before the start of work, after the end of work, during late night hours, and on holidays, as feature quantities, relate to the "recovery status" of "personal resources."
  • the PC operation time and the number of times the PC is operated before the start of work, after the end of work, during late night hours, and on holidays are extracted, for example, from the usage history of the PC 8, the attendance information output from the attendance management system 9, or the reporting information input to the operation terminal 6.
  • the work time interval (the length of time between the end of work on a certain day and the start of work on the following day) as a feature quantity is related to the "recovery status" of the "personal resources.”
  • the work time interval is extracted, for example, from the usage history of the PC 8, the attendance information output from the attendance management system 9, or the reporting information input to the operation terminal 6.
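The work time interval feature can be sketched, for example, from attendance timestamps; the record layout below is an illustrative assumption, not the actual output format of the attendance management system 9.

```python
from datetime import datetime

# Illustrative sketch only: the work time interval (rest interval) between
# the end of work on one day and the start of work on the following day,
# computed from assumed attendance timestamp records.
records = [
    ("2024-06-05 09:00", "2024-06-05 19:30"),
    ("2024-06-06 09:15", "2024-06-06 18:00"),
]

def work_time_intervals(days):
    fmt = "%Y-%m-%d %H:%M"
    parsed = [(datetime.strptime(s, fmt), datetime.strptime(e, fmt))
              for s, e in days]
    # hours from each day's end of work to the next day's start of work
    return [(parsed[i + 1][0] - parsed[i][1]).total_seconds() / 3600
            for i in range(len(parsed) - 1)]

print(work_time_intervals(records))  # [13.75]
```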
  • the volume of the worker's voice is related to the "recovery status" of the "personal resources.”
  • the volume of the worker's voice is measured, for example, by a microphone provided in the bio-information measurement terminal 3.
  • the feature quantity, i.e., the working time (the length of time from the time of entering the workplace to the time of leaving the workplace), is related to the "quantitative burden of work" of the "job demands."
  • the working time can be obtained, for example, from the location information of the worker or the attendance information output from the attendance management system 9.
  • the amount of time spent using a PC after work hours is related to the "quantitative burden of work" of the "demands of work.”
  • the amount of time spent using a PC after work hours is extracted, for example, from the usage history of the PC 8.
  • the time spent at a rest area and the number of times the rest area is used, as feature quantities, relate to the "quantitative workload" of the "job demands."
  • the time spent at a rest area and the number of times the rest area is used can be determined, for example, from a combination of the worker's location information and area information related to the rest area, etc.
  • the area information is obtained, for example, from the data server 5.
  • the feature quantities, i.e., the time period during which there is no PC input between the start and end of work and the number of such periods (the number of times that no input continues for a specified period of time or more), relate to the "quantitative workload" of the "job demands."
  • the time period during which there is no PC input between the start and end of work and the number of such periods are extracted, for example, from the usage history of the PC 8.
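An illustrative sketch of this feature pair, computed from input-event timestamps (seconds from the start of work); the threshold and the event format are assumptions made for this example.

```python
# Illustrative sketch only: from input-event timestamps, compute the total
# no-input time and the number of no-input periods lasting at least
# GAP_MIN_S seconds between the start and end of work.
GAP_MIN_S = 600  # count only gaps of 10 minutes or more (assumed threshold)

def idle_periods(event_times, work_start, work_end):
    bounds = [work_start] + sorted(event_times) + [work_end]
    gaps = [b - a for a, b in zip(bounds, bounds[1:])]
    long_gaps = [g for g in gaps if g >= GAP_MIN_S]
    return sum(long_gaps), len(long_gaps)

# input events at 5, 30, and 60 minutes within a 2-hour window
total, count = idle_periods([300, 1800, 3600], 0, 7200)
print(total, count)  # 6900 3
```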
  • the amount of conversation in the workplace is related to the "quantitative burden of work" of the "job demands."
  • the amount of conversation in the workplace can be obtained, for example, from the output (audio information) of the microphone equipped in the bioinformation measurement terminal 3.
  • the number of keyboard operations per unit time and the amount of mouse cursor movement per unit time, which are feature quantities, are related to the "qualitative burden of work" of the "job demands."
  • the number of keyboard operations per unit time and the amount of mouse cursor movement per unit time are extracted, for example, from the usage history of the PC 8.
  • the PC operation time is related to the "qualitative burden of work" of the "job demands."
  • the PC operation time is extracted, for example, from the usage history of the PC 8.
  • the task time with a high qualitative burden is related to the "qualitative burden of work" of the "job demands."
  • the task time with a high qualitative burden can be determined, for example, from bioinformation measured by the bioinformation measurement terminal 3.
  • the engagement estimation system 1 assumes that a state in which the heart rate, as bioinformation, is higher than a corresponding threshold value is a state of high qualitative burden, and determines the accumulated time in the state of high qualitative burden as the task time with a high qualitative burden.
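The accumulation rule just described can be sketched as follows; the threshold value and the one-sample-per-minute rate are assumptions made for this example.

```python
# Illustrative sketch only: time spent with a heart rate above a threshold
# is accumulated as the "task time with a high qualitative burden".
HR_THRESHOLD_BPM = 100  # assumed per-worker threshold
SAMPLE_PERIOD_S = 60    # assumed: one heart-rate sample per minute

def high_burden_seconds(heart_rates):
    return sum(SAMPLE_PERIOD_S for hr in heart_rates if hr > HR_THRESHOLD_BPM)

hr_series = [72, 95, 104, 110, 98, 121, 88]  # one sample per minute
print(high_burden_seconds(hr_series))  # 180
```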
  • the amount of activity of a worker is related to the "physical burden of work" of the "job demands.”
  • the amount of activity of a worker is expressed, for example, in METs.
  • the amount of activity of a worker can be calculated, for example, from bio-information (heart rate, blood pressure, skin temperature, amount of sweat, etc.) measured by the bio-information measuring terminal 3.
  • the number of steps taken by a worker is related to the "physical burden of work" of the "job demands."
  • the number of steps taken by a worker can be determined, for example, from the worker's location information.
  • the number of steps taken by a worker can be measured, for example, by a pedometer (exercise measuring terminal 10) carried by the worker.
  • the maximum heart rate of a worker is related to the "physical burden of work" of the "job demands."
  • the maximum heart rate of a worker is extracted, for example, from heart rate measurement data as biometric information.
  • the feature based on the response information includes the amount of change in the worker's mood between multiple points in time; that is, one of the multiple feature amounts is this change amount.
  • the worker answers the questions during the first period, and the amount of change during the first period is obtained. Also, during the first period, the remaining feature amounts among the multiple feature amounts are obtained by the bioinformation measuring terminal 3, the position measuring system 4, the data server 5, etc.
  • the worker answers the questions during the second time period, thereby obtaining the amount of change during the second time period. Also, during the second time period, the remaining feature quantities among the multiple feature quantities during the second time period are obtained in the bioinformation measurement terminal 3, the position measurement system 4, the data server 5, etc.
  • the regression unit 24 obtains a regression equation that represents the relationship between the multiple feature amounts and engagement in the first time period.
  • the estimation unit 25 estimates the worker's engagement in the second time period based on the multiple feature amounts in the second time period and the regression equation.
  • for a first worker, for example, a regression equation may be obtained showing a correlation between the amount of change in mood and engagement such that the greater the amount of change, the lower the engagement.
  • for a second worker, the ratio of the change in engagement to the amount of change may differ from that of the first worker, or the greater the amount of change, the higher the engagement may be.
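The regression and estimation described above can be sketched as an ordinary least-squares fit. For brevity a single feature is used, and every number below is fabricated for the example.

```python
# Illustrative sketch only: fit a least-squares line relating a
# first-period feature (e.g. the mood-change amount) to engagement scores
# obtained in advance, then apply the fitted equation to the second period.
feature_p1 = [0.2, 0.8, 0.5, 0.1]     # first-period feature values (fabricated)
engagement_p1 = [4.1, 2.6, 3.4, 4.4]  # first-period engagement, known in advance

n = len(feature_p1)
mx = sum(feature_p1) / n
my = sum(engagement_p1) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(feature_p1, engagement_p1))
         / sum((x - mx) ** 2 for x in feature_p1))
intercept = my - slope * mx  # regression equation: engagement = slope*x + intercept

feature_p2 = 0.7                           # second-period feature value
estimate = slope * feature_p2 + intercept  # estimated second-period engagement
print(round(estimate, 3))  # 2.865
```

With multiple features, the same idea extends to multiple linear regression over the full feature vector.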
  • the second period is one month (see FIG. 3).
  • the second period may be, for example, one day. That is, the amount of change in mood from morning to evening on a certain day may be used as a feature to estimate engagement on the same day.
  • the second period may also be a period shorter than one day (for example, a few hours).
  • when the second period includes June 5th and June 6th, the amount of change in mood from morning to evening on June 5th and the amount of change in mood from morning to evening on June 6th can be obtained.
  • the average value of the multiple amounts of change can be used as the feature amount.
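As an illustrative sketch, answers can be assumed to be coded 1 (very good) to 4 (very bad), one in the morning and one in the evening of each day; the feature is then the average morning-to-evening change. The data layout and values are fabricated for this example.

```python
# Illustrative sketch only: daily morning-to-evening mood-change amounts
# and their average over the period, from assumed answer codes 1..4.
answers = {
    "2024-06-05": {"morning": 2, "evening": 4},
    "2024-06-06": {"morning": 1, "evening": 2},
}

changes = [a["evening"] - a["morning"] for a in answers.values()]
average_change = sum(changes) / len(changes)
print(changes, average_change)  # [2, 1] 1.5
```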
  • the feature quantity based on the response information is a parameter corresponding to the worker's mood at each of a plurality of time points.
  • the numerical value 1, 2, 3, or 4 recorded as the response information in the morning and in the evening is itself a feature quantity based on the response information.
  • the larger the numerical value showing the response information, the lower the engagement.
  • the rate of change in engagement relative to the change in the numerical value showing the response information may be different compared to the case of the first worker, and, for example, the closer the numerical value showing the response information is to a specified numerical value between 1 and 4 (e.g., 2), the higher the engagement may be.
  • the second period is one month (see FIG. 3).
  • the second period may be, for example, one day, or may be a period shorter than one day (for example, a few hours).
  • the engagement estimation system 1 may be equipped with a display device that displays the information generated in the presentation content generation step.
  • the engagement estimation system 1 may include an operation unit that accepts operations for generating the declaration information, and may also serve as the operation terminal 6.
  • the engagement estimation system 1 may include a PC 8 or the like as a worker terminal that performs the above-mentioned questioning steps at multiple points in time.
  • the questioning step is performed twice per day.
  • the questioning step may be performed once per day, or three or more times per day.
  • the questioning step may be performed, for example, every other day or every few days.
  • the amount of change in the worker's mood between multiple points in time is the amount of change in the worker's mood from one point in time within a day to another point in time on the same day.
  • the amount of change in the worker's mood between multiple points in time may be the amount of change in the worker's mood from one day to another day.
  • Each of the multiple images showing human facial expressions displayed in the operation window 820 is not limited to illustrations and may be, for example, photographs.
  • the moods represented by the multiple images displayed in the operation window 820 are not limited to very good, good, bad, or very bad.
  • the moods may be, for example, a relaxed mood, a tense mood, or a tired mood.
  • Each of the multiple images displayed in the operation window 820 is not limited to images showing a person's facial expression, and may be an image or symbol showing a person's mood.
  • the image or symbol showing a person's mood may be, for example, an image or symbol of sweat showing a tired mood, an image or symbol such as an eighth note or a heart mark showing a good mood, an image or symbol of an arrow pointing up to the right showing a good mood, or an image or symbol of an arrow pointing down to the right showing a depressed mood.
  • the multiple images displayed in the operation window 820 may be identical to each other, and two or more of the multiple images may be selectable.
  • the multiple images displayed in the operation window 820 may be multiple star marks. The more star marks selected, the better the mood.
  • the multiple images displayed in the operation window 820 may also be multiple weather marks.
  • the multiple weather marks may include, for example, sunny, cloudy, and rainy marks.
  • the sunny mark represents a better mood than the cloudy mark.
  • the cloudy mark represents a better mood than the rainy mark.
  • the multiple weather marks may also include, for example, a mark that combines a sunny mark and a cloudy mark. This mark represents a worse mood than the sunny mark, but a better mood than the cloudy mark.
  • the number of images displayed may be two, three, five or more.
  • the mood represented by each image is displayed in text near (below) each image in the operation window 820.
  • the PC 8 may execute a process of displaying a question for prompting the worker to select an image that matches the worker's mood from among a plurality of images, based on control by the engagement estimation system 1.
  • the engagement estimation system 1 may transmit a predetermined command signal to the PC 8 at a predetermined time period, and the PC 8 may display a question on the display unit 82 upon receiving the command signal.
  • the feature based on the response information is the variability in the amount of change in the worker's mood in the basic example, the amount of change in the worker's mood in variant 1, and the response information itself in variant 2. Two or three of these may be used as features based on the response information to estimate engagement.
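The three response-based feature quantities just compared can be sketched from fabricated morning/evening answers coded 1 (very good) to 4 (very bad); taking "variability" to be the population variance is an assumption of this example.

```python
from statistics import pvariance, mean

# Illustrative sketch only: the three response-based feature quantities.
morning = [2, 1, 3, 2, 2]
evening = [4, 2, 3, 4, 1]

changes = [e - m for m, e in zip(morning, evening)]  # variant 1: change amounts
change_variability = pvariance(changes)              # basic example: variability
raw_answers = morning + evening                      # variant 2: answers themselves
print(changes, change_variability, mean(raw_answers))
```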
  • the feature determination unit 23 may refer to a correlation coefficient instead of the coefficient of determination to determine the strength of correlation between multiple parameters and engagement in the first period.
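For a simple linear fit, the coefficient of determination equals the square of the Pearson correlation coefficient, so either statistic ranks candidate features equivalently; a sketch with fabricated data:

```python
from math import sqrt

# Illustrative sketch only: Pearson correlation between a candidate
# feature and first-period engagement, and its square (the coefficient
# of determination for a simple linear fit).
def pearson(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / sqrt(sum((x - mx) ** 2 for x in xs)
                      * sum((y - my) ** 2 for y in ys))

feature = [0.1, 0.3, 0.5, 0.8, 0.9]       # fabricated feature values
engagement = [4.5, 4.0, 3.6, 2.9, 2.7]    # fabricated engagement scores

r = pearson(feature, engagement)
r_squared = r * r  # coefficient of determination for the simple linear fit
print(round(r, 3), round(r_squared, 3))
```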
  • the entity that executes the engagement estimation system 1 or the engagement estimation method in the present disclosure includes a computer system.
  • the computer system is mainly composed of a processor and a memory as hardware. At least a part of the functions of the entity that executes the engagement estimation system 1 or the engagement estimation method in the present disclosure is realized by the processor executing a program recorded in the memory of the computer system.
  • the program may be pre-recorded in the memory of the computer system, may be provided through a telecommunications line, or may be provided by being recorded on a non-transitory recording medium such as a memory card, an optical disk, or a hard disk drive that is readable by the computer system.
  • the processor of the computer system is composed of one or more electronic circuits including a semiconductor integrated circuit (IC) or a large scale integrated circuit (LSI).
  • the integrated circuits such as ICs or LSIs referred to here are called different names depending on the degree of integration, and include integrated circuits called system LSIs, VLSIs (Very Large Scale Integration), or ULSIs (Ultra Large Scale Integration).
  • a field-programmable gate array (FPGA) that is programmed after the LSI is manufactured, or a logic device that allows the reconfiguration of the connection relationships within the LSI or the reconfiguration of the circuit partitions within the LSI, can also be used as a processor.
  • Multiple electronic circuits may be integrated into one chip, or may be distributed across multiple chips.
  • the computer system referred to here includes a microcontroller having one or more processors and one or more memories.
  • the microcontroller is also composed of one or more electronic circuits including a semiconductor integrated circuit or a large-scale integrated circuit.
  • it is not essential for the engagement estimation system 1 that multiple functions be consolidated into one device, and the multiple components of the engagement estimation system 1 may be distributed across multiple devices. Furthermore, at least some of the functions of the engagement estimation system 1 may be realized by a server or a cloud (cloud computing), etc.
  • multiple functions that are distributed across multiple devices may be consolidated into one device.
  • at least two of the data server 5, the information processing server 7, and the engagement estimation system 1 may be consolidated into one device.
  • the PC 8 may also function as the operation terminal 6.
  • the bio-information measuring terminal 3 may also function as the exercise measuring terminal 10.
  • At least a portion of the functions of the engagement estimation system 1 may be realized by a computational model generated by machine learning.
  • the third step of determining a plurality of feature quantities may be realized by the computational model.
  • the engagement estimation method is a method for estimating a worker's engagement with work, which is executed by an engagement estimation system (1).
  • the engagement estimation method has a first step, a second step, a third step, a fourth step, and a fifth step.
  • a questioning step is carried out at multiple time points.
  • in the questioning step, multiple images corresponding one-to-one to multiple moods are displayed, and the worker is asked to select an image that corresponds to the worker's mood from the multiple images.
  • answer information regarding which image was selected from the multiple images at each of the multiple time points is stored.
  • multiple feature amounts are determined.
  • a regression equation is obtained that represents the relationship between the multiple feature amounts in the first period and the worker's engagement in the first period that is determined in advance.
  • the worker's engagement in the second period is estimated based on the multiple feature amounts in a second period different from the first period and the regression equation.
  • the multiple feature amounts include at least one of the following: a feature amount based on the worker's biometric information measured by the biometric information measuring terminal (3), a feature amount based on the worker's location information at the worker's workplace, a feature amount based on relationship information regarding the relationship between the worker and another worker at the workplace, and a feature amount based on the usage history of a computer system (PC8) used by the worker for work.
  • the multiple feature amounts further include a feature amount based on response information.
  • the worker selects an image that matches the worker's mood from among multiple images, and the engagement estimation system (1) estimates the worker's engagement.
  • the worker can intuitively select an image that matches his or her current mood. This reduces the burden on the worker in answering questions. For example, the burden on the worker is reduced compared to when a question is displayed, the worker reads the question, and then the worker answers the question.
  • a regression equation is calculated based on a plurality of feature quantities in the first period and the engagement of the worker in the first period that has been calculated in advance.
  • the process of calculating the regression equation and the process of estimating worker engagement in the second time period can be consolidated into the engagement estimation system (1).
  • the engagement estimation method in the first or second aspect further includes a presentation content generation step of generating content to be presented to the viewer based on the estimation result in the fifth step.
  • the engagement estimation method in the third aspect further includes a transmission step of transmitting the content generated in the presentation content generation step to the terminal.
  • the questioning step is carried out at multiple time periods in one day.
  • the response information includes information regarding changes in the worker's mood throughout the day. This allows the engagement estimation system (1) to effectively estimate engagement.
  • the third step includes a step of determining the amount of change in the worker's mood between multiple points in time.
  • the feature based on the response information includes the variance in the amount of change.
  • the engagement estimation system (1) can effectively estimate engagement.
  • the third step includes a step of determining the amount of change in the worker's mood between multiple points in time.
  • the feature based on the response information includes the amount of change.
  • the engagement estimation system (1) can effectively estimate engagement.
  • the feature based on the response information is a parameter corresponding to the worker's mood at each of a plurality of points in time.
  • the engagement estimation system (1) can effectively estimate engagement.
  • the engagement estimation method in any one of the first to eighth aspects further includes a notification step of issuing a notification to prompt a selection if an image that matches the worker's mood is not selected in the first step.
  • each of the multiple images displayed in the first step is an image showing a person's facial expression.
  • Configurations other than the first aspect are not essential to the engagement estimation method and may be omitted as appropriate.
  • the program according to the eleventh aspect is a program for causing one or more processors of a computer system (of the engagement estimation system 1) to execute the engagement estimation method according to any one of the first to tenth aspects.
  • an engagement estimation system (1) relating to a twelfth aspect estimates a worker's engagement with work.
  • the engagement estimation system (1) includes an acquisition unit (21), a memory unit (11), a feature determination unit (23), a regression unit (24), and an estimation unit (25).
  • the acquisition unit (21) acquires answer information from a worker terminal (PC8).
  • the worker terminal performs a questioning step at multiple time points. In the questioning step, multiple images corresponding one-to-one to multiple moods are displayed, and the worker is prompted to select an image from the multiple images that matches the worker's mood.
  • the answer information is information regarding which image was selected from the multiple images at each of the multiple time points.
  • the memory unit (11) stores the answer information.
  • the feature determination unit (23) determines multiple feature amounts.
  • the regression unit (24) acquires a regression equation expressing the relationship between the multiple feature amounts in the first period and the worker's engagement in the first period, which has been determined in advance.
  • the estimation unit (25) estimates the worker's engagement in the second period based on the multiple feature amounts in the second period different from the first period and the regression equation.
  • the multiple feature amounts include at least one of a feature amount based on the worker's biometric information measured by the biometric information measurement terminal (3), a feature amount based on the worker's position information in the worker's workplace, a feature amount based on relationship information regarding the relationship between the worker and another worker in the workplace, and a feature amount based on the usage history of a computer system (PC8) used by the worker for work.
  • the multiple feature amounts further include a feature amount based on response information.
  • the feature determination unit (23) determines the amount of change in the worker's mood between multiple points in time.
  • the feature based on the response information includes the variability in the amount of change.
  • the engagement estimation system (1) can effectively estimate engagement.
  • Configurations other than the twelfth aspect are not essential to the engagement estimation system (1) and may be omitted as appropriate.
  • various configurations (including modified examples) of the engagement estimation system (1) according to the embodiment can be embodied in an engagement estimation method, a (computer) program, or a non-transitory recording medium having a program recorded thereon.

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The purpose of the present disclosure is to reduce the burden on a worker required to estimate engagement. The present disclosure involves a questioning step in which a plurality of images corresponding one-to-one to a plurality of moods are displayed, and a worker is requested to select an image that matches the worker's mood from among the plurality of images. In a second step, response information is stored that relates to which of the plurality of images was selected at each of a plurality of time points. In a fifth step, the worker's engagement in a second period is estimated on the basis of a plurality of feature quantities in the second period and a regression equation. The plurality of feature quantities include feature quantities based on the response information.

Description

Engagement estimation method, program, and engagement estimation system
The present disclosure relates generally to an engagement estimation method, program, and engagement estimation system, and more specifically to an engagement estimation method, program, and engagement estimation system that estimates a worker's engagement with their work.
There is a known indicator of engagement (also called work engagement or employee engagement) related to workers' work. According to non-patent document 1, engagement is defined in terms of the following two points: (1) commitment to the organization, specifically, affective commitment (emotional attachment to the organization) and continuance commitment (desire to remain with the organization); and (2) extra-role behavior (any behavior that enables the organization to function effectively).
Increased engagement can, for example, lead to increased productivity, sales, customer satisfaction, and worker retention within an organization.
Traditionally, engagement has been quantified by conducting surveys of workers and analyzing their responses (e.g., Patent Documents 1 and 2).
However, it may take a lot of time for workers to answer the questionnaire, or they may find it bothersome. In this way, answering the questionnaire may become a burden for workers.
JP 2018-185680 A
JP 2019-175108 A
The present disclosure aims to provide an engagement estimation method, program, and engagement estimation system that can reduce the burden on workers required to estimate engagement.
An engagement estimation method according to one aspect of the present disclosure is a method executed by an engagement estimation system to estimate a worker's engagement with work. The engagement estimation method has a first step, a second step, a third step, a fourth step, and a fifth step. In the first step, a questioning step is carried out at multiple time points. In the questioning step, multiple images corresponding one-to-one to multiple moods are displayed, and the worker is asked to select an image from the multiple images that matches the worker's mood. In the second step, answer information regarding which image was selected from the multiple images at each of the multiple time points is stored. In the third step, multiple feature amounts are determined. In the fourth step, a regression equation is obtained that represents the relationship between the multiple feature amounts in a first period and the engagement of the worker in the first period that has been determined in advance. In the fifth step, the engagement of the worker in the second period is estimated based on the multiple feature amounts in a second period different from the first period and the regression equation. The multiple feature amounts include at least one of a feature amount based on biometric information of the worker measured by a biometric information measuring terminal, a feature amount based on location information of the worker at the worker's workplace, a feature amount based on relationship information regarding a relationship between the worker and another worker in the workplace, and a feature amount based on a usage history of a computer system used by the worker for the work. The multiple feature amounts further include a feature amount based on the response information.
A program according to one aspect of the present disclosure is a program for causing one or more processors of a computer system to execute the engagement estimation method.
An engagement estimation system according to one aspect of the present disclosure estimates a worker's engagement with work. The engagement estimation system includes an acquisition unit, a memory unit, a feature determination unit, a regression unit, and an estimation unit. The acquisition unit acquires answer information from a worker terminal. The worker terminal performs a questioning step at multiple time points. In the questioning step, multiple images corresponding one-to-one to multiple moods are displayed, and the worker is prompted to select an image from the multiple images that matches the worker's mood. The answer information is information regarding which image was selected from the multiple images at each of the multiple time points. The memory unit stores the answer information. The feature determination unit determines multiple feature values. The regression unit acquires a regression equation representing the relationship between the multiple feature values in a first period and the worker's engagement in the first period that has been determined in advance. The estimation unit estimates the engagement of the worker in a second period different from the first period based on the multiple feature amounts in the second period and the regression equation. The multiple feature amounts include at least one of a feature amount based on biometric information of the worker measured by a biometric information measuring terminal, a feature amount based on location information of the worker at the worker's workplace, a feature amount based on relationship information regarding a relationship between the worker and another worker in the workplace, and a feature amount based on a usage history of a computer system used by the worker for the work. The multiple feature amounts further include a feature amount based on the response information.
FIG. 1 is a block diagram of an engagement estimation system and related components according to one embodiment.
FIG. 2 is a graph illustrating a process in the engagement estimation system.
FIG. 3 is an explanatory diagram showing an engagement estimation process performed by the engagement estimation system.
FIG. 4 is an explanatory diagram showing an operation window displayed on a PC in the engagement estimation method using the engagement estimation system.
FIG. 5 is a graph showing an example of response information acquired by the engagement estimation system.
FIG. 6 is a graph showing an example of the amount of change in response information acquired by the engagement estimation system.
FIG. 7 is a flowchart showing a process flow of a part of an engagement estimation method using the engagement estimation system.
 (実施形態)
 以下、実施形態に係るエンゲージメント推定方法、プログラム及びエンゲージメント推定システム1について、図面を用いて説明する。ただし、下記の実施形態は、本開示の様々な実施形態の1つに過ぎない。下記の実施形態は、本開示の目的を達成できれば、設計等に応じて種々の変更が可能である。
(Embodiment)
Hereinafter, an engagement estimation method, a program, and an engagement estimation system 1 according to an embodiment will be described with reference to the drawings. However, the embodiment described below is merely one of various embodiments of the present disclosure. The embodiment described below can be modified in various ways depending on the design, etc., as long as the object of the present disclosure can be achieved.
 (概要)
 図1に、本実施形態のエンゲージメント推定システム1の構成を概略的に示す。エンゲージメント推定システム1は、労働者のエンゲージメントを推定するために使用される。本開示で言う「労働者」とは、労働をする者全般を指す。本開示で言う「労働者」は、一般的な意味で言う「労働者」とは異なり、労働の対価として報酬を受け取る者だけではなく、無報酬で労働をする者をも指す。ただし、以下では、代表例として、労働の対価として報酬を受け取る者のエンゲージメントを推定する場合について説明する。よって、エンゲージメント推定システム1は、例えば、企業、役所又は団体において使用される。以下では、代表例として、エンゲージメント推定システム1が企業において使用される場合について説明する。
(Overview)
FIG. 1 shows a schematic configuration of an engagement estimation system 1 of this embodiment. The engagement estimation system 1 is used to estimate the engagement of workers. In the present disclosure, "worker" refers to anyone who works in general. Unlike "laborer" in the general sense, "worker" in the present disclosure refers not only to those who receive remuneration in exchange for labor, but also to those who work without remuneration. However, in the following, as a representative example, a case where the engagement of a person who receives remuneration in exchange for labor is estimated will be described. Therefore, the engagement estimation system 1 is used, for example, in companies, government offices, or organizations. In the following, as a representative example, a case where the engagement estimation system 1 is used in a company will be described.
 図1に示す、本実施形態のエンゲージメント推定システム1は、仕事に対する労働者のエンゲージメントを推定する。エンゲージメント推定システム1は、(第1)取得部21と、記憶部11と、特徴量決定部23と、回帰部24と、推定部25と、を備える。取得部21は、労働者用端末(PC8)から、回答情報を取得する。労働者端末は、質問ステップを、複数の時点において実施する。質問ステップでは、複数の気分と一対一で対応した複数の画像を表示し(図4参照)、複数の画像の中から労働者の気分として当てはまる画像を労働者に選択させる。回答情報は、複数の時点の各々において複数の画像のうちいずれの画像が選択されたかに関する情報である。記憶部11は、回答情報を記憶する。特徴量決定部23は、複数の特徴量を決定する。回帰部24は、第1の期間(図3参照)における複数の特徴量と、予め求められた、第1の期間における労働者のエンゲージメントと、の関係を表す回帰式(図2参照)を取得する。推定部25は、第1の期間とは別の第2の期間(図3参照)における複数の特徴量と、回帰式と、に基づいて、第2の期間における労働者のエンゲージメントを推定する。複数の特徴量は、生体情報計測端末3により計測される労働者の生体情報に基づいた特徴量と、労働者の就業場所における、労働者の位置情報に基づいた特徴量と、職場における労働者と別の労働者との関係性に関する関係性情報に基づいた特徴量と、労働者が仕事に使用するコンピュータシステム(PC8)の使用履歴に基づいた特徴量と、のうち、少なくとも1つを含む。複数の特徴量は、回答情報に基づいた特徴量を更に含む。 Engagement estimation system 1 of this embodiment shown in FIG. 1 estimates a worker's engagement with work. Engagement estimation system 1 includes a (first) acquisition unit 21, a memory unit 11, a feature determination unit 23, a regression unit 24, and an estimation unit 25. The acquisition unit 21 acquires answer information from a worker terminal (PC 8). The worker terminal performs a question step at multiple time points. In the question step, multiple images corresponding one-to-one to multiple moods are displayed (see FIG. 4), and the worker is prompted to select an image that corresponds to the worker's mood from the multiple images. The answer information is information regarding which image was selected from the multiple images at each of the multiple time points. The memory unit 11 stores the answer information. The feature determination unit 23 determines multiple feature values. The regression unit 24 acquires a regression equation (see FIG. 2) that represents the relationship between the multiple feature values in a first period (see FIG. 3) and the worker's engagement in the first period that has been determined in advance. The estimation unit 25 estimates the worker's engagement in a second period (see FIG. 
3) different from the first period, based on a plurality of feature amounts in the second period and a regression equation. The plurality of feature amounts include at least one of a feature amount based on the worker's biometric information measured by the biometric information measurement terminal 3, a feature amount based on the worker's location information in the worker's workplace, a feature amount based on relationship information regarding the relationship between the worker and another worker in the workplace, and a feature amount based on the usage history of a computer system (PC 8) used by the worker for work. The plurality of feature amounts further includes a feature amount based on response information.
 本実施形態によれば、複数の画像の中から労働者の気分として当てはまる画像を労働者が選択することで、エンゲージメント推定システム1によりエンゲージメントが推定される。労働者は、現在の自分の気分に当てはまる画像を直感的に選択すればよい。そのため、回答に要する労働者の負担が軽減される。例えば、質問文が表示されて、その質問文を労働者が読んだ上で、労働者が回答を行う場合と比較して、本実施形態では労働者の負担が軽減される。 In this embodiment, the worker selects an image that fits the worker's mood from among multiple images, and the engagement estimation system 1 estimates the worker's engagement. The worker can intuitively select an image that fits their current mood. This reduces the burden on the worker in answering questions. For example, in this embodiment, the burden on the worker is reduced compared to when a question is displayed, the worker reads the question, and then the worker answers.
 また、本実施形態によれば、質問への回答のみに基づいてエンゲージメントを推定する場合と比較して、労働者の生体情報、位置情報、関係性情報及びコンピュータシステムの使用履歴のうち少なくとも1つを特徴量として用いることで、より客観的にエンゲージメントを推定することができる。また、労働者の生体情報、位置情報、関係性情報及びコンピュータシステムの使用履歴のうち少なくとも1つを用いることで、労働者の気分を尋ねる以外の質問の項目数を少なく済ませることや、労働者の気分を尋ねる以外の質問を実施せずにエンゲージメントを推定することも可能となる。そのため、質問に回答する人(労働者等)の負担を軽減することができる。 Furthermore, according to this embodiment, by using at least one of the worker's biometric information, location information, relationship information, and computer system usage history as a feature, it is possible to estimate engagement more objectively than when engagement is estimated based solely on responses to questions. Furthermore, by using at least one of the worker's biometric information, location information, relationship information, and computer system usage history, it is possible to reduce the number of questions other than those asking about the worker's mood, and to estimate engagement without asking any questions other than those asking about the worker's mood. This reduces the burden on people (workers, etc.) who answer questions.
 また、本実施形態のエンゲージメント推定方法は、エンゲージメント推定システム1により実行され、仕事に対する労働者のエンゲージメントを推定する方法である。エンゲージメント推定方法は、第1ステップと、第2ステップと、第3ステップと、第4ステップと、第5ステップと、を有する。第1ステップでは、質問ステップを、複数の時点において実施する。質問ステップでは、複数の気分と一対一で対応した複数の画像を表示し、複数の画像の中から労働者の気分として当てはまる画像を労働者に選択させる。第2ステップでは、複数の時点の各々において複数の画像のうちいずれの画像が選択されたかに関する回答情報を記憶する。第3ステップでは、複数の特徴量を決定する。第4ステップでは、第1の期間における複数の特徴量と、予め求められた、第1の期間における労働者のエンゲージメントと、の関係を表す回帰式を取得する。第5ステップでは、第1の期間とは別の第2の期間における複数の特徴量と、回帰式と、に基づいて、第2の期間における労働者のエンゲージメントを推定する。複数の特徴量は、生体情報計測端末3により計測される労働者の生体情報に基づいた特徴量と、労働者の就業場所における、労働者の位置情報に基づいた特徴量と、職場における労働者と別の労働者との関係性に関する関係性情報に基づいた特徴量と、労働者が仕事に使用するコンピュータシステム(PC8)の使用履歴に基づいた特徴量と、のうち、少なくとも1つを含む。複数の特徴量は、回答情報に基づいた特徴量を更に含む。 The engagement estimation method of this embodiment is executed by the engagement estimation system 1, and is a method for estimating a worker's engagement with work. The engagement estimation method has a first step, a second step, a third step, a fourth step, and a fifth step. In the first step, a questioning step is performed at multiple time points. In the questioning step, multiple images corresponding one-to-one to multiple moods are displayed, and the worker is asked to select an image that corresponds to the worker's mood from the multiple images. In the second step, answer information regarding which image was selected from the multiple images at each of the multiple time points is stored. In the third step, multiple feature amounts are determined. In the fourth step, a regression equation is obtained that represents the relationship between the multiple feature amounts in the first period and the worker's engagement in the first period that is determined in advance. In the fifth step, the worker's engagement in the second period is estimated based on the multiple feature amounts in a second period different from the first period and the regression equation. 
The multiple feature amounts include at least one of the following: a feature amount based on the worker's biometric information measured by the biometric information measuring terminal 3; a feature amount based on the worker's location information at the worker's workplace; a feature amount based on relationship information regarding the relationship between the worker and other workers at the workplace; and a feature amount based on the usage history of a computer system (PC8) used by the worker for work. The multiple feature amounts further include a feature amount based on response information.
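As an illustration of the fourth and fifth steps, the sketch below fits a regression equation to hypothetical period-1 features and engagement values, then applies it to period-2 features. The specific features, the numbers, and the use of ordinary least squares are assumptions for illustration only, not the claimed implementation.

```python
import numpy as np

# Hypothetical period-1 data: one row per worker, columns are features
# (e.g. mean answer value, mean heart rate). y1 holds the engagement
# scores obtained in advance from the full questionnaire.
X1 = np.array([[1.5, 72.0], [2.0, 80.0], [3.5, 90.0], [1.0, 65.0]])
y1 = np.array([49.0, 40.0, 20.0, 57.5])

# Fourth step: obtain the regression equation y = b0 + b1*x1 + b2*x2
# by least squares (prepend a column of ones for the intercept b0).
A1 = np.column_stack([np.ones(len(X1)), X1])
coef, *_ = np.linalg.lstsq(A1, y1, rcond=None)

# Fifth step: estimate period-2 engagement from period-2 feature values.
X2 = np.array([[2.5, 85.0]])
A2 = np.column_stack([np.ones(len(X2)), X2])
estimate = A2 @ coef
```

In practice the feature vector would combine the answer information with whichever of the biometric, location, relationship, and usage-history features are available, but the fit-then-predict structure stays the same.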
 また、エンゲージメント推定方法は、プログラムにて具現化可能である。本実施形態のプログラムは、エンゲージメント推定方法を、コンピュータシステムの1以上のプロセッサに実行させるためのプログラムである。プログラムは、コンピュータシステムで読み取り可能な非一時的記録媒体に記録されていてもよい。 The engagement estimation method can also be embodied in a program. The program of this embodiment is a program for causing one or more processors of a computer system to execute the engagement estimation method. The program may be recorded on a non-transitory recording medium that can be read by the computer system.
 (詳細)
 (1)全体構成
 以下、エンゲージメント推定システム1及びこれに関連する各構成について、より詳細に説明する。
(Details)
(1) Overall Configuration
The engagement estimation system 1 and each of the related components will be described in more detail below.
 本実施形態では、労働者は複数存在する。エンゲージメント推定システム1は、複数の労働者の各々のエンゲージメントを推定する。 In this embodiment, there are multiple workers. The engagement estimation system 1 estimates the engagement of each of the multiple workers.
 エンゲージメント推定システム1は、例えば、生体情報計測端末3、位置計測システム4、データサーバ5、操作端末6、情報処理サーバ7、PC(パーソナルコンピュータ)8、勤怠管理システム9、及び、運動計測端末10と共に使用される。 The engagement estimation system 1 is used together with, for example, a bio-information measuring terminal 3, a position measuring system 4, a data server 5, an operation terminal 6, an information processing server 7, a PC (personal computer) 8, an attendance management system 9, and an exercise measuring terminal 10.
 (2)生体情報計測端末
 生体情報計測端末3は、複数の労働者の各々の生体情報を計測する。生体情報は、例えば、心拍数、血圧、皮膚温、発汗量、及び、音声情報のうち少なくとも1つを含む。1つの生体情報計測端末3により、複数種類の生体情報(例えば、心拍数と血圧)が計測されてもよい。あるいは、複数の生体情報計測端末3が存在して、複数の生体情報計測端末3がそれぞれ、異なる種類の生体情報を計測してもよい。
(2) Biological Information Measurement Terminal
The biological information measurement terminal 3 measures biological information of each of the multiple workers. The biological information includes, for example, at least one of heart rate, blood pressure, skin temperature, sweat rate, and voice information. A single biological information measurement terminal 3 may measure multiple types of biological information (for example, heart rate and blood pressure). Alternatively, there may be multiple biological information measurement terminals 3, and each of the multiple biological information measurement terminals 3 may measure a different type of biological information.
 生体情報計測端末3は、例えば、労働者が装着するウェアラブル端末である。ウェアラブル端末は、例えば、光学式心拍センサを備え、光学式心拍センサにより労働者の心拍数と血圧とを計測する。また、ウェアラブル端末は、例えば、温度センサを備え、温度センサにより労働者の皮膚温を計測する。また、ウェアラブル端末は、例えば、汗センサを備え、汗センサにより、労働者の発汗量を計測する。 The bioinformation measuring terminal 3 is, for example, a wearable terminal worn by a worker. The wearable terminal is equipped with, for example, an optical heart rate sensor, which measures the worker's heart rate and blood pressure. The wearable terminal is also equipped with, for example, a temperature sensor, which measures the worker's skin temperature. The wearable terminal is also equipped with, for example, a sweat sensor, which measures the amount of sweat produced by the worker.
 別の一例として、生体情報計測端末3は、例えば、カメラ(近赤外カメラ等)で労働者を一定期間撮像して画像データを生成し、画像データに基づいて労働者の心拍数を計測する。 As another example, the bio-information measuring terminal 3 captures an image of a worker for a certain period of time using a camera (such as a near-infrared camera) to generate image data, and measures the worker's heart rate based on the image data.
 別の一例として、生体情報計測端末3は、例えば、腕帯を有した血圧計であり、労働者の腕に腕帯が巻かれた状態で、労働者の血圧を計測する。 As another example, the bio-information measuring terminal 3 is, for example, a blood pressure monitor with an arm band, and measures the worker's blood pressure while the arm band is wrapped around the worker's arm.
 別の一例として、生体情報計測端末3は、例えば、マイクロフォンを備え、マイクロフォンにより労働者の音声を、電気信号の形式の音声情報に変換する。マイクロフォンは、ウェアラブル端末に備えられていてもよい。 As another example, the bioinformation measuring terminal 3 is provided with a microphone, for example, and converts the voice of the worker into audio information in the form of an electrical signal using the microphone. The microphone may be provided in a wearable terminal.
 (3)位置計測システム
 位置計測システム4は、複数の労働者の各々の位置情報を計測する。位置情報は、例えば、複数の労働者の各々の座標の情報を含む。
(3) Position Measurement System
The position measurement system 4 measures position information of each of the multiple workers. The position information includes, for example, coordinate information of each of the multiple workers.
 例えば、複数の労働者の各々は、スマートフォン又はウェアラブル端末等の携帯端末を携帯している。複数の労働者の就業場所(例えば、オフィスビル、店舗又は工場)には、複数のビーコン装置が設置されている。 For example, each of the multiple workers carries a mobile terminal such as a smartphone or a wearable terminal. Multiple beacon devices are installed in the workplaces of the multiple workers (e.g., office buildings, stores, or factories).
 以下では、複数の労働者のうち1人の労働者に着目して、この労働者の位置情報の計測例を説明する。また、他の労働者の位置情報も同様に計測することができる。 Below, we will focus on one worker among multiple workers and explain an example of measuring the location information of this worker. The location information of other workers can also be measured in a similar manner.
 複数のビーコン装置の各々は、ビーコン信号を発信する。労働者が携帯している携帯端末は、ビーコン信号の受信信号強度を計測する。受信信号強度の情報は、携帯端末から位置計測システム4へ送信される。位置計測システム4は、受信信号強度に基づいて、携帯端末と複数のビーコン装置の各々との間の距離を算出する。さらに、位置計測システム4は、携帯端末と複数のビーコン装置の各々との間の距離と、複数のビーコン装置の各々の位置情報とに基づいて、3点測位により、携帯端末の位置情報を計測する。位置計測システム4は、携帯端末の位置情報を、携帯端末を携帯している労働者の位置情報として、エンゲージメント推定システム1へ送信する。 Each of the multiple beacon devices transmits a beacon signal. A mobile device carried by the worker measures the received signal strength of the beacon signal. Information on the received signal strength is transmitted from the mobile device to the position measurement system 4. The position measurement system 4 calculates the distance between the mobile device and each of the multiple beacon devices based on the received signal strength. Furthermore, the position measurement system 4 measures the position information of the mobile device by three-point positioning based on the distance between the mobile device and each of the multiple beacon devices and the position information of each of the multiple beacon devices. The position measurement system 4 transmits the position information of the mobile device to the engagement estimation system 1 as position information of the worker carrying the mobile device.
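The three-point positioning described above can be sketched as follows. This is a minimal two-dimensional example assuming the terminal-to-beacon distances have already been derived from the received signal strengths; it is an illustrative sketch, not the claimed implementation.

```python
def trilaterate(beacons, distances):
    """Estimate a 2-D position from three known beacon positions and the
    measured distance to each beacon. Subtracting the first circle
    equation from the other two linearizes the problem in (x, y)."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = distances
    # Two linear equations a*x + b*y = c:
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero if the beacons are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

With more than three beacons, or noisy distance estimates, a least-squares variant of the same linearization would typically be used instead.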
 なお、ビーコン信号を受信する携帯端末(例えば、ウェアラブル端末)は、生体情報計測端末3を兼ねていてもよい。 The mobile terminal (e.g., a wearable terminal) that receives the beacon signal may also serve as the biological information measurement terminal 3.
 (4)データサーバ
 データサーバ5は、関係性情報を記憶している。関係性情報は、複数の労働者の職場(企業、役所又は団体等)における、上記複数の労働者の互いの関係性に関する情報である。より詳細には、関係性情報は、例えば、複数の労働者の上下関係に関する情報を含む。複数の労働者の上下関係に関する情報は、例えば、複数の労働者の各々の職位の情報である。職位は、仕事上の地位を指す。職位は、役職又は階級等である。また、関係性情報は、例えば、複数の労働者の各々が所属している組織(部署又は部門等)に関する、組織情報を含む。部署又は部門は、例えば、××部、××課、又は、××センター等の名称で区別される。また、関係性情報は、例えば、複数の労働者の各々が携わっている業務(プロジェクト等)の識別に関する業務情報(例えば、携わっている業務の名称の情報)を含む。
(4) Data Server
The data server 5 stores relationship information. The relationship information is information about the relationships between the multiple workers in their workplaces (such as a company, a government office, or an organization). More specifically, the relationship information includes, for example, information about the hierarchical relationships between the multiple workers. The information about the hierarchical relationships between the multiple workers is, for example, information about the job titles of each of the multiple workers. The job title refers to a work position. The job title is a position or a rank. The relationship information also includes, for example, organizational information about the organization (department or division, etc.) to which each of the multiple workers belongs. The department or division is distinguished by the name, for example, XX Department, XX Section, or XX Center. The relationship information also includes, for example, business information (for example, information about the name of the business) related to the identification of the business (project, etc.) in which each of the multiple workers is involved.
 また、データサーバ5は、エリア情報を記憶している。エリア情報は、例えば、複数の労働者の就業場所の地図情報を含む。エリア情報は、例えば、各部屋の位置と、各部屋の用途と、の情報を含む。 The data server 5 also stores area information. The area information includes, for example, map information of the workplaces of multiple workers. The area information includes, for example, information on the location of each room and the purpose of each room.
 (5)操作端末
 操作端末6は、例えば、パーソナルコンピュータ、又は、携帯端末である。携帯端末は、例えば、スマートフォン等の携帯電話、ウェアラブル端末、又は、タブレット端末である。
(5) Operation Terminal
The operation terminal 6 is, for example, a personal computer or a mobile terminal. The mobile terminal is, for example, a mobile phone such as a smartphone, a wearable terminal, or a tablet terminal.
 操作端末6は、人の操作に応じて、仕事に関する労働者の申告情報を生成する。労働者の申告情報を操作端末6で生成するために、労働者自身が操作端末6を操作してもよいし、別の者が操作端末6を操作してもよい。 The operation terminal 6 generates the worker's work-related reporting information in response to human operation. In order to generate the worker's reporting information on the operation terminal 6, the worker himself/herself may operate the operation terminal 6, or another person may operate the operation terminal 6.
 操作端末6は、例えば、タッチパネルディスプレイを含み、タッチパネルディスプレイにアンケート項目を表示する。人は、タッチパネルディスプレイを操作することで、アンケート項目に回答する。すると、操作端末6は、人から得た回答を含んだ申告情報を生成する。 The operation terminal 6 includes, for example, a touch panel display, and displays survey items on the touch panel display. A person answers the survey items by operating the touch panel display. The operation terminal 6 then generates declaration information that includes the answers obtained from the person.
 人は、例えば、アンケート項目として示される質問に対して、複数の選択肢の中から回答を選択する。複数の選択肢は、例えば、「そう思う」、「ややそう思う」、「どちらとも言えない」、「ややそう思わない」、「そう思わない」の5つの選択肢である。 For example, when a question is presented as a questionnaire item, a person selects an answer from among multiple options. The multiple options are, for example, five options: "Agree," "Somewhat agree," "Can't say," "Somewhat disagree," and "Disagree."
 アンケート項目は、例えば、情報処理サーバ7で上述の第1の期間のエンゲージメントを推定するための質問を含む。このような質問を、以下では「第1質問」と呼び、第1質問への回答を、以下では「第1回答」と呼ぶ。第1質問は、例えば、労働者の勤務先、仕事の内容、及び、同僚に対する、労働者の考え方及び感じ方を問う質問である。 The questionnaire items include, for example, questions for estimating engagement in the first period described above in the information processing server 7. Such questions are hereinafter referred to as "first questions", and answers to the first questions are hereinafter referred to as "first answers". The first questions are, for example, questions asking about the worker's place of employment, the content of the work, and the worker's thoughts and feelings about their colleagues.
 また、アンケート項目は、例えば、エンゲージメント推定システム1で第1の期間とは別の第2の期間におけるエンゲージメントを推定するための質問を含む。このような質問を、以下では「第2質問」と呼び、第2質問への回答を、以下では「第2回答」と呼ぶ。少なくとも1つの第2質問が、第1質問と同じ質問であってもよい。第2質問の項目数(質問数)は、第1質問の項目数(質問数)よりも少ないことが好ましい。 The questionnaire items also include, for example, questions for estimating engagement in a second period different from the first period by the engagement estimation system 1. Such questions are hereinafter referred to as "second questions," and answers to the second questions are hereinafter referred to as "second answers." At least one second question may be the same as the first question. It is preferable that the number of items (number of questions) of the second questions is less than the number of items (number of questions) of the first questions.
 (6)情報処理サーバ
 情報処理サーバ7は、第1の期間における労働者のエンゲージメントを推定する。より詳細には、情報処理サーバ7は、まず、操作端末6から申告情報を取得する。申告情報は、少なくとも1つの第1回答を含む。少なくとも1つの第1回答に基づいて、情報処理サーバ7は、第1の期間における労働者のエンゲージメントを推定する。
(6) Information Processing Server
The information processing server 7 estimates the engagement of the worker in a first period. More specifically, the information processing server 7 first acquires declaration information from the operation terminal 6. The declaration information includes at least one first response. Based on the at least one first response, the information processing server 7 estimates the engagement of the worker in the first period.
 エンゲージメントは、例えば、数値で表される。情報処理サーバ7によるエンゲージメントの推定方法としては、特許文献1に開示されているような既知の方法を採用することができる。例えば、複数の選択肢の中からどれを第1回答として選択したかによって、第1回答に対するスコアが決定される。情報処理サーバ7は、複数の第1回答の各々に対するスコアの合計を、第1の期間における労働者のエンゲージメントと推定する。 Engagement is expressed, for example, as a numerical value. A known method such as that disclosed in Patent Document 1 can be adopted as a method for the information processing server 7 to estimate engagement. For example, a score for the first answer is determined depending on which of multiple options is selected as the first answer. The information processing server 7 estimates the sum of the scores for each of the multiple first answers as the worker's engagement in the first period.
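A minimal sketch of this score summation follows. The numeric score assigned to each of the five options is an assumption for illustration, since the disclosure does not fix the values.

```python
# Illustrative scores for the five answer options (assumed values;
# higher = more positive response).
SCORES = {
    "agree": 5,
    "somewhat agree": 4,
    "neutral": 3,
    "somewhat disagree": 2,
    "disagree": 1,
}

def estimate_period1_engagement(first_answers):
    """Sum of the per-answer scores across all first answers, treated
    as the worker's engagement in the first period."""
    return sum(SCORES[answer] for answer in first_answers)
```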
 (7)PC
 労働者は、仕事にPC(パーソナルコンピュータ)8を使用する。より詳細には、複数の労働者は、例えば、1人1台以上のPC8を勤務先から割り当てられている。複数の労働者の各々は、自分に割り当てられた1台以上のPC8を使用する。
(7) PC
The workers use PCs (personal computers) 8 for work. More specifically, each of the workers is assigned one or more PCs 8 by his/her workplace. Each of the workers uses the one or more PCs 8 assigned to him/her.
 PC8は、自機の使用履歴を記憶する記憶装置として、記憶部84を備えている。記憶部84は、ハードディスクドライブ(HDD)又はソリッドステートドライブ(SSD)等である。PC8の使用履歴を取得して記憶部84に記憶するためのソフトウェアが、PC8にインストールされていてもよい。 PC8 is equipped with a memory unit 84 as a storage device that stores the usage history of the PC8. The memory unit 84 is a hard disk drive (HDD) or a solid state drive (SSD) or the like. Software for acquiring the usage history of PC8 and storing it in the memory unit 84 may be installed in PC8.
 また、PC8は、処理部81と、表示部82と、操作部83と、を備えている。 The PC 8 also includes a processing unit 81, a display unit 82, and an operation unit 83.
 処理部81は、1以上のプロセッサ及びメモリを有するコンピュータシステムを含んでいる。コンピュータシステムのメモリに記録されたプログラムを、コンピュータシステムのプロセッサが実行することにより、処理部81の機能が実現される。プログラムは、メモリに記録されていてもよいし、インターネット等の電気通信回線を通して提供されてもよく、メモリカード等の非一時的記録媒体に記録されて提供されてもよい。 The processing unit 81 includes a computer system having one or more processors and a memory. The functions of the processing unit 81 are realized by the processor of the computer system executing a program recorded in the memory of the computer system. The program may be recorded in the memory, or may be provided via a telecommunications line such as the Internet, or may be recorded on a non-transitory recording medium such as a memory card and provided.
 表示部82は、情報を表示するためのディスプレイを有する。 The display unit 82 has a display for displaying information.
 操作部83は、人の操作を受け付ける。操作部83は、例えば、マウス、キーボード、釦、及び、タッチパネル等のうち、少なくとも1つを有する。 The operation unit 83 accepts operations by a person. The operation unit 83 has at least one of, for example, a mouse, a keyboard, a button, and a touch panel.
 処理部81は、表示部82、操作部83及び記憶部84を制御する。所定の時間帯になると、処理部81は、複数の画像の中から労働者の気分として当てはまる画像を労働者に選択させるための質問を表す操作ウィンドウ820(図4参照)を、表示部82に表示させる。労働者は、操作部83を操作して、いずれか1つの画像を選択することにより、質問に回答することができる。複数の画像のうちいずれの画像が選択されたかに関する回答情報は、記憶部84に記憶される。 The processing unit 81 controls the display unit 82, the operation unit 83, and the memory unit 84. At a predetermined time, the processing unit 81 causes the display unit 82 to display an operation window 820 (see FIG. 4) that displays a question for prompting the worker to select an image from a plurality of images that fits the worker's mood. The worker can answer the question by operating the operation unit 83 to select one of the images. Answer information regarding which image from the plurality of images was selected is stored in the memory unit 84.
 PC8には、操作ウィンドウ820を表示部82に表示させていずれかの画像を労働者に選択させる質問ステップを実行するためのソフトウェアが、予めインストールされている。ソフトウェアは、エンゲージメント推定システム1から有線通信又は無線通信によりPC8へ提供されてもよいし、他の装置からPC8へ提供されてもよい。  PC8 is pre-installed with software for executing a question step that displays an operation window 820 on the display unit 82 and prompts the worker to select one of the images. The software may be provided to PC8 from engagement estimation system 1 via wired or wireless communication, or may be provided to PC8 from another device.
 処理部81は、質問ステップを、複数の時点において実施する。より詳細には、処理部81は、質問ステップを、第1の期間の複数の時点と、第2の期間の複数の時点と、において実施する。 The processing unit 81 performs the questioning step at multiple time points. More specifically, the processing unit 81 performs the questioning step at multiple time points in the first period and at multiple time points in the second period.
 一例として、複数の時点は、朝と夕方との2つの時点である。つまり、処理部81は、質問ステップを、第1の期間の朝と夕方と、第2の期間の朝と夕方と、において実施する。より詳細には、一例として、複数の時点は、労働者が出勤した時点と、労働者が退勤する直前の時点と、である。このように、本実施形態のエンゲージメント推定方法において、第1ステップでは、質問ステップを、1日のうちの複数の時間帯に実施する。 As an example, the multiple time points are two time points, morning and evening. That is, the processing unit 81 performs the questioning step in the morning and evening of the first period, and in the morning and evening of the second period. In more detail, as an example, the multiple time points are the time when the worker arrives at work and the time immediately before the worker leaves work. Thus, in the engagement estimation method of this embodiment, in the first step, the questioning step is performed at multiple time periods in one day.
 操作ウィンドウ820は、一例として、朝と夕方との所定の時間帯に、ポップアップとして自動的に表示される。労働者は、マウスカーソルを移動させて操作ウィンドウ820の所定の領域をクリックすることで、処理部81に処理を実行させることができる。 As an example, the operation window 820 is automatically displayed as a pop-up during specified time periods in the morning and evening. A worker can cause the processing unit 81 to execute processing by moving the mouse cursor and clicking a specified area of the operation window 820.
 本実施形態のエンゲージメント推定方法では、第1ステップにおいて表示される複数の画像の各々は、人の表情を示す画像である。すなわち、図4に示すように、操作ウィンドウ820には、人の表情を表す複数の画像が表示される。より詳細には、複数の画像はそれぞれ、イラストである。図4では、左から右へ順に、とても良い気分(very good)を表す画像、良い気分(good)を表す画像、悪い気分(bad)を表す画像、とても悪い気分(very bad)を表す画像が表示されている。また、各画像の近傍(下方)には、各画像が表す気分が文字により表示されている。 In the engagement estimation method of this embodiment, each of the multiple images displayed in the first step is an image showing a person's facial expression. That is, as shown in FIG. 4, multiple images showing people's facial expressions are displayed in the operation window 820. More specifically, each of the multiple images is an illustration. In FIG. 4, displayed from left to right are an image showing a very good mood (very good), an image showing a good mood (good), an image showing a bad mood (bad), and an image showing a very bad mood (very bad). In addition, the mood that each image represents is displayed in text near (below) each image.
 操作ウィンドウ820が表示されたとき、労働者は、いずれかの画像をクリックする。すると、処理部81は、クリックされた画像が選択されたと判断し、回答情報を生成する。 When the operation window 820 is displayed, the worker clicks on one of the images. The processing unit 81 then determines that the clicked image has been selected and generates answer information.
 回答情報は、1、2、3又は4の数値で表される。すなわち、とても良い気分(very good)を表す画像が選択されると、回答情報は1となる。良い気分(good)を表す画像が選択されると、回答情報は2となる。悪い気分(bad)を表す画像が選択されると、回答情報は3となる。とても悪い気分(very bad)を表す画像が選択されると、回答情報は4となる。 The answer information is represented by a number: 1, 2, 3, or 4. That is, when an image that represents a very good mood is selected, the answer information is 1. When an image that represents a good mood is selected, the answer information is 2. When an image that represents a bad mood is selected, the answer information is 3. When an image that represents a very bad mood is selected, the answer information is 4.
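The mapping from the selected image to the numeric answer information, together with one derived feature (the within-day change in the answer information, cf. FIG. 6), can be sketched as follows; the `daily_change` helper is an illustrative assumption about how such a feature might be computed.

```python
# Answer values as described above: 1 = very good ... 4 = very bad.
MOOD_VALUES = {"very good": 1, "good": 2, "bad": 3, "very bad": 4}

def answer_value(selected_image):
    """Numeric answer information for the image the worker clicked."""
    return MOOD_VALUES[selected_image]

def daily_change(morning_image, evening_image):
    """Within-day change in the answer value (evening minus morning);
    a positive value means the reported mood worsened during the day."""
    return MOOD_VALUES[evening_image] - MOOD_VALUES[morning_image]
```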
 また、図4に示すように、操作ウィンドウ820には、最小化釦821、最大化釦822、及び、閉じる釦823が表示される。最小化釦821がクリックされると、操作ウィンドウ820が非表示となり、その後、タスクバーに表示されたアイコンがクリックされると、操作ウィンドウ820が再び表示される。最大化釦822がクリックされると、操作ウィンドウ820が表示部82の画面全体に表示される。閉じる釦823がクリックされると、質問ステップを実行するためのソフトウェアが終了して操作ウィンドウ820が非表示となり、その後、質問ステップを実行するためのソフトウェアを再起動する操作がされると、操作ウィンドウ820が再び表示される。 As shown in FIG. 4, the operation window 820 displays a minimize button 821, a maximize button 822, and a close button 823. When the minimize button 821 is clicked, the operation window 820 is hidden, and when an icon displayed on the task bar is then clicked, the operation window 820 is displayed again. When the maximize button 822 is clicked, the operation window 820 is displayed over the entire screen of the display unit 82. When the close button 823 is clicked, the software for executing the question step is terminated and the operation window 820 is hidden, and when an operation is then performed to restart the software for executing the question step, the operation window 820 is displayed again.
 また、操作ウィンドウ820を表示部82に表示してから一定時間内にいずれの画像も選択されなかった場合には、処理部81は、画像の選択を促す通知を行う。通知は、例えば、選択を促すメッセージを、表示部82にポップアップで表示することにより行われる。このように、本実施形態のエンゲージメント推定方法は、第1ステップにおいて労働者の気分として当てはまる画像が選択されなかった場合に、選択を促す通知を行う通知ステップを有する。また、処理部81は、選択を促す通知を行ってから一定時間が経過すると、選択を促す通知を再び行う。すなわち、本実施形態のエンゲージメント推定方法では、第1ステップにおいて労働者の気分として当てはまる画像が選択されなかった場合に、選択がされるまで一定時間ごとに、選択を促す通知を行う。 Furthermore, if no image is selected within a certain time after the operation window 820 is displayed on the display unit 82, the processing unit 81 issues a notification urging the selection of an image. The notification is issued, for example, by displaying a message prompting the selection as a pop-up on the display unit 82. In this way, the engagement estimation method of this embodiment has a notification step of issuing a notification prompting the selection if an image that matches the worker's mood is not selected in the first step. Furthermore, the processing unit 81 issues a notification prompting the selection again when a certain time has passed since issuing the notification prompting the selection. In other words, in the engagement estimation method of this embodiment, if an image that matches the worker's mood is not selected in the first step, a notification prompting the selection is issued at certain time intervals until a selection is made.
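The repeated reminder behaviour can be modelled as below. Rather than actually waiting, the sketch computes the times at which reminders would be issued; the fixed interval is a parameter, and this is an illustrative model rather than the claimed implementation.

```python
def reminder_times(window_shown_at, image_selected_at, interval):
    """Times at which a 'please select an image' reminder is issued:
    the first one `interval` after the operation window appears, then
    every `interval` thereafter, stopping once a selection is made.
    All arguments are in the same time unit (e.g. seconds)."""
    times = []
    t = window_shown_at + interval
    while t < image_selected_at:
        times.append(t)
        t += interval
    return times
```

A real implementation would drive this with a timer and cancel it on selection, but the schedule of notifications is the same.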
 PC8は、労働者が仕事に使用するコンピュータシステムの一例である。コンピュータシステムは、1以上のコンピュータを含む。労働者が仕事に使用するコンピュータシステムは、PC8に限定されず、例えば、スマートフォン等の携帯電話、タブレット端末、又は、ホストコンピュータであってもよい。また、労働者が仕事に使用するコンピュータシステムは、例えば、乗り物又は工作機械等の操作対象を操作するためのコンピュータシステムであってもよく、使用履歴は、操作対象の操作履歴を含んでいてもよい。 PC8 is an example of a computer system that a worker uses for work. A computer system includes one or more computers. The computer system that a worker uses for work is not limited to PC8, and may be, for example, a mobile phone such as a smartphone, a tablet terminal, or a host computer. In addition, the computer system that a worker uses for work may be, for example, a computer system for operating an object to be operated, such as a vehicle or a machine tool, and the usage history may include the operation history of the object to be operated.
 また、PC8は、複数の時点において上述の質問ステップを実施する労働者用端末の一例である。労働者用端末は、PC8に限定されず、例えば、スマートフォン等の携帯電話、又は、タブレット端末であってもよい。 The PC 8 is also an example of a worker terminal that performs the above-mentioned questioning steps at multiple points in time. The worker terminal is not limited to the PC 8, and may be, for example, a mobile phone such as a smartphone, or a tablet terminal.
 本実施形態では、操作端末6においてアンケート(質問)が提示され、労働者又は別の者が、アンケートに回答する。これにより、操作端末6において申告情報が生成される。また、本実施形態では、PC8において、複数の画像の中から労働者の気分として当てはまる画像を労働者に選択させるための質問が提示され、労働者自身が、質問に回答する。これにより、PC8において、労働者の気分を表す回答情報が生成される。すなわち、本実施形態では、操作端末6とPC8とのそれぞれにおいて質問が提示される。ただし、一変形例として、PC8が操作端末6を兼ねていてもよい。つまり、本実施形態において操作端末6において提示される質問を、PC8が提示してもよい。あるいは、本実施形態においてPC8において提示される質問を、操作端末6が提示してもよい。 In this embodiment, a questionnaire (question) is presented on the operation terminal 6, and the worker or another person answers the questionnaire. As a result, declaration information is generated on the operation terminal 6. Also, in this embodiment, a question is presented on the PC 8 to prompt the worker to select an image that matches the worker's mood from among multiple images, and the worker himself answers the question. As a result, answer information representing the worker's mood is generated on the PC 8. That is, in this embodiment, questions are presented on both the operation terminal 6 and the PC 8. However, as a variation, the PC 8 may also function as the operation terminal 6. That is, the questions presented on the operation terminal 6 in this embodiment may be presented by the PC 8. Alternatively, the questions presented on the PC 8 in this embodiment may be presented by the operation terminal 6.
 (8)勤怠管理システム
 勤怠管理システム9は、複数の労働者の各々の勤怠情報を生成する。例えば、複数の労働者の各々は、携帯端末(スマートフォン若しくはウェアラブル端末等)又はICカード等の被読取装置を携帯しており、出勤時及び退勤時に、勤怠管理システム9の読取装置に被読取装置をかざす。すると、勤怠管理システム9は被読取装置から、被読取装置に記憶された識別情報を読み取る。これにより、勤怠管理システム9は、複数の労働者の各々の勤怠情報を生成し、勤怠情報は、出勤時刻及び退勤時刻の情報を含む。
(8) Attendance Management System The attendance management system 9 generates attendance information for each of a plurality of workers. For example, each of the plurality of workers carries a readable device such as a mobile terminal (such as a smartphone or wearable terminal) or an IC card, and holds the readable device over the reading device of the attendance management system 9 when arriving at and leaving work. The attendance management system 9 then reads the identification information stored in the readable device from the readable device. As a result, the attendance management system 9 generates attendance information for each of a plurality of workers, and the attendance information includes information on the arrival time and the leaving time.
 (9)運動計測端末
 運動計測端末10は、労働者の運動指標を求める。運動指標は、運動の質及び量の少なくとも一方を表す。運動指標は、例えば、活動量、及び、移動量の少なくとも一方を含む。活動量は、例えば、METs(Metabolic equivalents)で表される。移動量は、例えば、歩数である。
(9) Exercise Measurement Terminal The exercise measurement terminal 10 obtains an exercise index of the worker. The exercise index indicates at least one of the quality and quantity of exercise. The exercise index includes, for example, at least one of the amount of activity and the amount of movement. The amount of activity is expressed, for example, in METs (Metabolic equivalents). The amount of movement is, for example, the number of steps.
 運動計測端末10は、例えば、ウェアラブル端末である。労働者は、ウェアラブル端末を携帯する。ウェアラブル端末は、例えば、歩数計を備え、労働者の歩数を計測する。また、ウェアラブル端末は、例えば、上述したように労働者の生体情報(心拍数、血圧、皮膚温又は発汗量等)を計測する。ウェアラブル端末は、生体情報に基づいて、労働者の活動量を求める。 The exercise measuring terminal 10 is, for example, a wearable terminal. The worker carries the wearable terminal. The wearable terminal is equipped with, for example, a pedometer and measures the number of steps taken by the worker. The wearable terminal also measures, for example, the worker's biometric information (heart rate, blood pressure, skin temperature, amount of sweat, etc.) as described above. The wearable terminal calculates the worker's activity level based on the biometric information.
 (10)エンゲージメント推定システム
 エンゲージメント推定システム1は、処理部2と、記憶部11と、通信部12と、を備える。
(10) Engagement Estimation System The engagement estimation system 1 includes a processing unit 2, a memory unit 11, and a communication unit 12.
 記憶部11は、ハードディスクドライブ(HDD)又はソリッドステートドライブ(SSD)等によって構成される記憶装置である。記憶部11は、情報を記憶する。記憶部11は、外部装置から取得した情報、例えば、生体情報計測端末3から取得した生体情報、位置計測システム4から取得した位置情報、データサーバ5から取得した関係性情報、並びに、PC8から取得した使用履歴及び回答情報等を記憶する。 The storage unit 11 is a storage device configured with a hard disk drive (HDD) or a solid state drive (SSD) or the like. The storage unit 11 stores information. The storage unit 11 stores information acquired from an external device, such as biometric information acquired from the biometric information measuring terminal 3, location information acquired from the location measuring system 4, relationship information acquired from the data server 5, and usage history and response information acquired from the PC 8.
 通信部12は、通信インタフェース装置を含んでいる。通信部12は、通信インタフェース装置を介して、外部装置(例えば、生体情報計測端末3、位置計測システム4、データサーバ5、及び、PC8)と通信可能である。本開示でいう「通信可能」とは、有線通信又は無線通信の適宜の通信方式により、直接的、又はネットワーク若しくは中継器等を介して間接的に、信号を授受できることを意味する。 The communication unit 12 includes a communication interface device. The communication unit 12 is capable of communicating with external devices (e.g., the bioinformation measurement terminal 3, the position measurement system 4, the data server 5, and the PC 8) via the communication interface device. In this disclosure, "capable of communication" means that signals can be sent and received directly or indirectly via a network or a repeater, etc., using an appropriate communication method such as wired communication or wireless communication.
 処理部2は、1以上のプロセッサ及びメモリを有するコンピュータシステムを含んでいる。コンピュータシステムのメモリに記録されたプログラムを、コンピュータシステムのプロセッサが実行することにより、処理部2の機能が実現される。プログラムは、メモリに記録されていてもよいし、インターネット等の電気通信回線を通して提供されてもよく、メモリカード等の非一時的記録媒体に記録されて提供されてもよい。 The processing unit 2 includes a computer system having one or more processors and a memory. The functions of the processing unit 2 are realized by the processor of the computer system executing a program recorded in the memory of the computer system. The program may be recorded in the memory, or may be provided via a telecommunications line such as the Internet, or may be recorded on a non-transitory recording medium such as a memory card and provided.
 処理部2は、第1取得部21と、第2取得部22と、特徴量決定部23と、回帰部24と、推定部25と、提示内容生成部26と、通信処理部27と、を有する。なお、これらは、処理部2によって実現される機能を示しているに過ぎず、必ずしも実体のある構成を示しているわけではない。 The processing unit 2 has a first acquisition unit 21, a second acquisition unit 22, a feature determination unit 23, a regression unit 24, an estimation unit 25, a presentation content generation unit 26, and a communication processing unit 27. Note that these merely indicate the functions realized by the processing unit 2, and do not necessarily indicate a concrete configuration.
 (10.1)第1取得部
 第1取得部21は、通信部12を介して、労働者の情報(複数の労働者の各々の情報)を取得する。労働者の情報は、生体情報計測端末3により計測される労働者の生体情報、位置計測システム4により計測される労働者の位置情報、及び、データサーバ5に記憶された関係性情報を含む。
(10.1) First Acquisition Unit The first acquisition unit 21 acquires worker information (information on each of a plurality of workers) via the communication unit 12. The worker information includes the worker's bioinformation measured by the bioinformation measuring terminal 3, the worker's position information measured by the position measuring system 4, and the relationship information stored in the data server 5.
 また、労働者の情報は、操作端末6で生成される申告情報を更に含む。申告情報は、仕事に関する情報であって、操作端末6に対する操作に応じて生成される。申告情報は、例えば、上述の第1回答と第2回答とのうち一方又は両方である。 The worker information further includes reporting information generated by the operation terminal 6. The reporting information is information about the work and is generated in response to operations on the operation terminal 6. The reporting information is, for example, one or both of the first and second responses described above.
 また、労働者の情報は、労働者が仕事に使用するコンピュータシステムの使用履歴を更に含む。本実施形態では、上記コンピュータシステムは、PC8である。すなわち、労働者の情報は、PC8の使用履歴を含む。 The worker information also includes the usage history of the computer system that the worker uses for work. In this embodiment, the computer system is PC8. That is, the worker information includes the usage history of PC8.
 また、労働者の情報は、労働者の気分を表す回答情報を更に含む。労働者がPC8を操作することにより、PC8で回答情報が生成される。 The worker information also includes answer information that indicates the worker's mood. The answer information is generated by the PC 8 as the worker operates the PC 8.
 また、労働者の情報は、労働者の勤怠情報を更に含む。労働者の勤怠情報は、勤怠管理システム9で生成される。 In addition, the worker information further includes the worker's attendance information. The worker's attendance information is generated by the attendance management system 9.
 また、労働者の情報は、労働者の運動指標を更に含む。労働者の運動指標は、運動計測端末10で生成される。 The worker information further includes the worker's exercise index. The worker's exercise index is generated by the exercise measurement terminal 10.
 (10.2)第2取得部
 第2取得部22は、情報処理サーバ7で推定される、第1の期間における労働者のエンゲージメントを取得する。
(10.2) Second Acquisition Unit The second acquisition unit 22 acquires the worker's engagement in the first period, as estimated by the information processing server 7.
 (10.3)特徴量決定部及び回帰部
 特徴量決定部23は、労働者の情報と、第1の期間のエンゲージメントとの関係に基づいて、労働者の情報から、複数の特徴量を決定する。例えば、特徴量決定部23は、労働者の情報から、複数のパラメータを抽出する。複数のパラメータは、例えば、労働者と特定の人との会話時間、労働者の活動量、及び、労働者の残業時間等である。
(10.3) Feature Determining Unit and Regression Unit The feature determining unit 23 determines a plurality of features from the worker information based on the relationship between the worker information and the engagement in the first period. For example, the feature determining unit 23 extracts a plurality of parameters from the worker information. The plurality of parameters are, for example, the duration of a conversation between the worker and a specific person, the amount of activity of the worker, and the overtime hours of the worker.
 或るパラメータが、労働者の或る情報と一致していてもよい。つまり、労働者の情報をそのままパラメータとしてもよい。例えば、労働者の残業時間は、労働者の情報の一例である。労働者の残業時間が、パラメータとされてもよい。 A certain parameter may coincide with certain information about a worker. In other words, the information about the worker may be used as a parameter directly. For example, the overtime hours of a worker is an example of information about a worker. The overtime hours of a worker may be used as a parameter.
 あるいは、或るパラメータが、労働者の或る情報に基づいて求められてもよい。つまり、労働者の情報を加工してパラメータとしてもよい。例えば、位置計測システム4により計測される労働者の位置情報、及び、データサーバ5に記憶された関係性情報はそれぞれ、労働者の情報の一例である。これらに基づいて、労働者と特定の人との会話時間が、パラメータとして求められてもよい。より詳細には、労働者と上司との(対面での)会話時間は、例えば、労働者と上司との各々の位置情報から求めることができる。すなわち、労働者と上司との間の距離が所定距離(例えば、1メートル)以内である時間を、労働者と上司との会話時間とすることができる。また、或る労働者が上司であるか否かが、関係性情報に基づいて特定される。 Alternatively, a certain parameter may be found based on certain information about the worker. That is, the information about the worker may be processed to become the parameter. For example, the position information of the worker measured by the position measurement system 4 and the relationship information stored in the data server 5 are each an example of worker information. Based on these, the conversation time between the worker and a specific person may be found as a parameter. More specifically, the conversation time (face-to-face) between the worker and the supervisor can be found, for example, from the position information of the worker and the supervisor. That is, the time when the distance between the worker and the supervisor is within a specified distance (for example, one meter) can be determined as the conversation time between the worker and the supervisor. Also, whether or not a certain worker is a supervisor can be identified based on the relationship information.
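As a minimal sketch of the derivation described above, conversation time can be accumulated from paired position samples. The function name, the sample data, and the fixed sampling interval are illustrative assumptions; only the 1-meter threshold follows the example in the text:

```python
from math import hypot

def conversation_seconds(worker_pos, supervisor_pos, max_dist=1.0, sample_sec=60):
    """Estimate face-to-face conversation time as the total time during which
    the two workers' positions were within max_dist meters of each other.
    Positions are parallel lists of (x, y) samples taken every sample_sec
    seconds (hypothetical format, not the embodiment's actual data model)."""
    return sum(
        sample_sec
        for (wx, wy), (sx, sy) in zip(worker_pos, supervisor_pos)
        if hypot(wx - sx, wy - sy) <= max_dist
    )

worker = [(0.0, 0.0), (0.5, 0.0), (5.0, 5.0)]
supervisor = [(0.4, 0.3), (0.6, 0.8), (5.0, 9.0)]
print(conversation_seconds(worker, supervisor))  # two samples within 1 m -> 120
```

Whether a given counterpart is the worker's supervisor would be resolved separately from the relationship information, as the text notes.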
 特徴量決定部23により、1人1人の労働者ごとに、複数の特徴量が決定されてもよい。つまり、或る労働者に対応する複数の特徴量を決定するために、当該労働者の情報と、当該労働者の第1の期間のエンゲージメントとが参照されてもよい。 The characteristic amount determining unit 23 may determine multiple characteristic amounts for each worker. In other words, to determine multiple characteristic amounts corresponding to a certain worker, information about the worker and the worker's engagement during the first period may be referenced.
 あるいは、特徴量決定部23により、2以上の労働者に共通の、複数の特徴量が決定されてもよい。つまり、2以上の労働者に対応する複数の特徴量を決定するために、当該2以上の労働者の各々の情報と、当該2以上の労働者の各々の第1の期間のエンゲージメントとが参照されてもよい。 Alternatively, the feature determination unit 23 may determine multiple features common to two or more workers. In other words, to determine multiple features corresponding to two or more workers, information on each of the two or more workers and the engagement of each of the two or more workers in the first period may be referenced.
 回帰部24は、第1の期間における複数の特徴量と、上述の第1質問に基づいて予め求められた、第1の期間における労働者のエンゲージメントと、の関係を表す回帰式を取得する。本開示において、「回帰式を取得する」処理は、エンゲージメント推定システム1の外部の構成で求められた回帰式を、上記外部の構成から回帰部24が取得する処理であってもよいし、エンゲージメント推定システム1の記憶部11に記憶された回帰式を回帰部24が取得する処理であってもよい。あるいは、「回帰式を取得する」処理は、回帰式を求める処理であってもよい。本実施形態の回帰部24は、「回帰式を取得する」処理として、第1の期間における複数の特徴量と、予め求められた、第1の期間における労働者のエンゲージメントと、に基づいて、回帰式を求める処理を行う。 The regression unit 24 acquires a regression equation that expresses the relationship between the multiple feature amounts in the first period and the worker's engagement in the first period, which was determined in advance based on the first question described above. In the present disclosure, the process of "acquiring a regression equation" may be a process in which the regression unit 24 acquires, from a component external to the engagement estimation system 1, a regression equation determined by that external component, or a process in which the regression unit 24 acquires a regression equation stored in the storage unit 11 of the engagement estimation system 1. Alternatively, the process of "acquiring a regression equation" may be a process of determining a regression equation. In this embodiment, as the process of "acquiring a regression equation", the regression unit 24 determines a regression equation based on the multiple feature amounts in the first period and the worker's engagement in the first period determined in advance.
 本実施形態では、以下で説明するように、特徴量決定部23が複数の特徴量を決定する処理と、回帰部24が回帰式を求める処理と、が一体的に行われる。 In this embodiment, as described below, the process in which the feature determination unit 23 determines multiple feature quantities and the process in which the regression unit 24 determines a regression equation are performed in an integrated manner.
 複数の特徴量の各々は、エンゲージメントを決定づける要素である。複数のパラメータは、労働者の情報等に基づいて得られる。複数のパラメータのうち特定のパラメータに着目したとき、上記特定のパラメータが、エンゲージメントとの相関が比較的強いこともあれば、相関が比較的弱いこともあり、相関が無いこともあり得る。 Each of the multiple feature quantities is a factor that determines engagement. The multiple parameters are obtained based on worker information, etc. When focusing on a specific parameter among the multiple parameters, the specific parameter may have a relatively strong correlation with engagement, a relatively weak correlation, or no correlation at all.
 特徴量決定部23は、複数のパラメータのうち、第1の期間のエンゲージメントとの相関が強いパラメータを、特徴量とする。特徴量決定部23は、相関の強さを、例えば、重回帰分析により算出する。すなわち、特徴量決定部23(及び回帰部24)は、第1の期間のエンゲージメントを目的変数、第1の期間の複数のパラメータを複数の説明変数として、重回帰式を求め、さらに、重回帰式の決定係数を求める。決定係数は、0から1までの値となる。決定係数が大きいほど、相関が強い。特徴量決定部23は、決定係数に基づいて、複数の特徴量を決定する。特徴量決定部23は、例えば、決定係数が閾値より大きいときの複数の説明変数を、複数の特徴量とする。なお、特徴量決定部23は、重回帰式に代えて、単回帰式を求め、単回帰式から相関の強さ(決定係数)を求めてもよい。あるいは、特徴量決定部23は、他の機械学習モデルを用いて決定係数を求めてもよい。要するに、特徴量決定部23は、回帰分析等により、複数の特徴量とエンゲージメントとの相関を求めればよい。 The feature determination unit 23 determines, among the multiple parameters, a parameter that has a strong correlation with the engagement in the first period as a feature. The feature determination unit 23 calculates the strength of the correlation, for example, by multiple regression analysis. That is, the feature determination unit 23 (and the regression unit 24) obtains a multiple regression equation using the engagement in the first period as the objective variable and the multiple parameters in the first period as multiple explanatory variables, and further obtains a coefficient of determination of the multiple regression equation. The coefficient of determination is a value between 0 and 1. The larger the coefficient of determination, the stronger the correlation. The feature determination unit 23 determines the multiple feature quantities based on the coefficient of determination. For example, the feature determination unit 23 determines the multiple explanatory variables when the coefficient of determination is greater than a threshold value as the multiple feature quantities. Note that the feature determination unit 23 may obtain a simple regression equation instead of the multiple regression equation, and obtain the strength of correlation (coefficient of determination) from the simple regression equation. Alternatively, the feature determination unit 23 may obtain the coefficient of determination using another machine learning model. In short, the feature determination unit 23 can determine the correlation between multiple features and engagement by regression analysis or the like.
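A minimal sketch of this selection rule follows, using NumPy least squares in place of the embodiment's regression machinery. The data, the threshold value, and the function names are illustrative assumptions, not values from the embodiment:

```python
import numpy as np

def r_squared(X, y):
    """Coefficient of determination of an ordinary least-squares fit of y on X
    (with an intercept column)."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot

def select_features(params, engagement, threshold=0.5):
    """Treat the parameters as explanatory variables and engagement as the
    objective variable; adopt the parameter set as features only when the
    multiple regression's R^2 exceeds the threshold."""
    names = list(params)
    X = np.column_stack([params[n] for n in names])
    return names if r_squared(X, np.asarray(engagement)) > threshold else []

# Hypothetical monthly data: engagement tracks overtime hours closely.
params = {
    "overtime_h": np.array([10.0, 20.0, 30.0, 40.0]),
    "steps_k": np.array([7.0, 6.5, 8.0, 7.2]),
}
engagement = np.array([3.9, 3.1, 2.2, 1.0])
print(select_features(params, engagement))  # -> ['overtime_h', 'steps_k']
```

Replacing the multiple regression with a simple (single-variable) regression, as the text allows, amounts to calling `r_squared` with one column at a time.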
 図2は、第1の期間のエンゲージメントを目的変数、第1の期間の1つのパラメータを説明変数として求めた単回帰式(直線L1)の一例を図示している。説明変数は、具体的には、2次上司(直属上司の、さらに直属の上司)との会話回数である。目的変数と説明変数とがそれぞれ、複数の時期に求められる。図2では、各時期の目的変数と説明変数との組が、点としてプロットされている。これら複数の点に基づいて、単回帰式が求められる。 Figure 2 illustrates an example of a simple regression equation (straight line L1) obtained with engagement in the first period as the dependent variable and one parameter in the first period as the explanatory variable. Specifically, the explanatory variable is the number of conversations with a second-level supervisor (the direct supervisor's further direct supervisor). The dependent variable and explanatory variable are each obtained for multiple time periods. In Figure 2, pairs of the dependent variable and explanatory variable for each time period are plotted as points. A simple regression equation is obtained based on these multiple points.
 特定の1人の労働者に対応する複数の特徴量を決定する際は、図2は、上記特定の1人の労働者に関するデータ(目的変数と説明変数との組)のプロットとなる。また、2以上の労働者に共通の、複数の特徴量を決定する際は、図2は、上記2以上の労働者の各々に関するデータ(目的変数と説明変数との組)のプロットとなる。 When determining multiple features corresponding to one specific worker, Figure 2 will be a plot of data (pairs of objective variables and explanatory variables) for that one specific worker. When determining multiple features common to two or more workers, Figure 2 will be a plot of data (pairs of objective variables and explanatory variables) for each of the two or more workers.
 また、特徴量決定部23は、次の4つのパラメータのうち、少なくとも1つのパラメータを、特徴量とする。4つのパラメータは、生体情報計測端末3により計測される労働者の生体情報に基づいたパラメータと、労働者の就業場所における、労働者の位置情報に基づいたパラメータと、職場における労働者と別の労働者との関係性に関する関係性情報に基づいたパラメータと、労働者が仕事に使用するコンピュータシステム(PC8)の使用履歴に基づいたパラメータと、である。 The feature determination unit 23 also determines, as the feature, at least one of the following four parameters: a parameter based on the worker's biometric information measured by the biometric measurement terminal 3, a parameter based on the worker's location information at the worker's workplace, a parameter based on relationship information regarding the relationship between the worker and other workers at the workplace, and a parameter based on the usage history of the computer system (PC8) used by the worker for work.
 なお、第1の期間が第1の時点と第2の時点とを含むとき、特徴量決定部23は、第1の時点のエンゲージメントと、第2の時点のエンゲージメントと、の差分に基づいて、複数の特徴量を決定してもよい。言い換えると、特徴量決定部23は、エンゲージメントの変化量に基づいて、複数の特徴量を決定してもよい。例えば、特徴量決定部23は、複数のパラメータを複数の説明変数、エンゲージメントの変化量を目的変数として重回帰式を求め、複数のパラメータのうち、エンゲージメントの変化量との相関が強いパラメータを、特徴量とすればよい。 When the first period includes a first time point and a second time point, the feature determination unit 23 may determine multiple feature amounts based on the difference between the engagement at the first time point and the engagement at the second time point. In other words, the feature determination unit 23 may determine multiple feature amounts based on the amount of change in engagement. For example, the feature determination unit 23 may determine a multiple regression equation using multiple parameters as multiple explanatory variables and the amount of change in engagement as the objective variable, and may determine, from among the multiple parameters, a parameter that has a strong correlation with the amount of change in engagement as the feature amount.
 また、特徴量決定部23は、説明変数を、所定のパラメータと基準値との差分としてもよい。基準値は、例えば、特定の期間における所定のパラメータの平均値としてもよい。例えば、所定のパラメータを、第1の期間の労働者の活動量とする。この場合、基準値を、例えば、第1の期間の前年の同時期の、上記労働者の活動量の平均値としてもよい。あるいは、基準値を、例えば、第1の期間より所定日数(例えば、6か月)前から第1の期間までの、上記労働者の活動量の平均値としてもよい。所定のパラメータと基準値との差分を取ることで、労働者ごとの基準値の違いがエンゲージメントの推定結果に影響する可能性を低減させることができる。 Furthermore, the feature determination unit 23 may set the explanatory variable as the difference between a predetermined parameter and a reference value. The reference value may be, for example, the average value of the predetermined parameter in a specific period. For example, the predetermined parameter is set to the activity level of the worker in a first period. In this case, the reference value may be, for example, the average value of the activity level of the worker in the same period of the previous year to the first period. Alternatively, the reference value may be, for example, the average value of the activity level of the worker from a predetermined number of days (for example, six months) before the first period to the first period. By taking the difference between the predetermined parameter and the reference value, it is possible to reduce the possibility that differences in the reference value for each worker will affect the engagement estimation result.
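The baseline subtraction described above can be sketched as follows; the function name and the numbers are illustrative assumptions:

```python
def baseline_delta(current_values, reference_values):
    """Turn a parameter into an explanatory variable as (parameter - reference
    value), where the reference is the mean of the same parameter over a
    comparison period, e.g. the same period of the previous year."""
    baseline = sum(reference_values) / len(reference_values)
    return [v - baseline for v in current_values]

# Hypothetical activity levels: this period vs. last year's same-period mean.
print(baseline_delta([10.0, 12.0], [8.0, 10.0, 12.0]))  # [0.0, 2.0]
```

Centering each worker's parameter on their own baseline in this way is what reduces the influence of per-worker baseline differences on the estimate.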
 (10.3.1)回答情報に基づいた特徴量
 複数の特徴量は、労働者の気分を表す回答情報に基づいた特徴量を少なくとも含む。
(10.3.1) Feature Amount Based on Answer Information The multiple feature amounts include at least a feature amount based on answer information that represents the mood of the worker.
 回答情報は、複数の時点(朝と夕方)の各々において複数の画像のうちいずれの画像が選択されたかに関する情報である。上述の通り、回答情報は、1、2、3又は4の数値で表される。 The answer information is information regarding which of the multiple images was selected at each of the multiple time points (morning and evening). As mentioned above, the answer information is expressed as a number: 1, 2, 3, or 4.
 各日の、朝と夕方それぞれにおける回答情報の一例を、図5に示す。6月5日の朝及び夕方、並びに、6月6日の夕方には、良い気分(good)を表す画像が選択されたため、回答情報は2である。6月6日の朝には、悪い気分(bad)を表す画像が選択されたため、回答情報は3である。 Figure 5 shows an example of response information for the morning and evening of each day. In the morning and evening of June 5th, and in the evening of June 6th, an image representing a good mood (good) was selected, so the response information is 2. In the morning of June 6th, an image representing a bad mood (bad) was selected, so the response information is 3.
 本実施形態のエンゲージメント推定方法では、第3ステップ(複数の特徴量を決定するステップ)は、複数の時点間での労働者の気分の変化量を求めるステップを含む。より詳細には、第3ステップにおいて特徴量決定部23は、各日において、朝から夕方にかけての気分の変化量を求める。変化量は、夕方における回答情報の値から、同日朝における回答情報の値を引いた値である。朝から夕方にかけて労働者の気分が変わらなかった場合は、変化量は0となる。回答情報の値は気分が良いほど小さいため、朝から夕方にかけて労働者の気分が上向いた場合は、変化量は負の値となる。朝から夕方にかけて労働者の気分が下向いた場合は、変化量は正の値となる。 In the engagement estimation method of this embodiment, the third step (a step of determining multiple feature quantities) includes a step of determining the amount of change in the worker's mood between multiple time points. More specifically, in the third step, the feature quantity determination unit 23 determines the amount of change in mood from morning to evening for each day. The amount of change is the value obtained by subtracting the value of the answer information in the morning of the same day from the value of the answer information in the evening. If the worker's mood does not change from morning to evening, the amount of change is 0. Because smaller answer values represent better moods, the amount of change is a negative value if the worker's mood improves from morning to evening, and a positive value if it deteriorates.
 例えば、図5に示すように、6月5日の朝及び夕方の回答情報は2である。そのため、6月5日の気分の変化量は、2-2=0である(図6参照)。また、図5に示すように、6月6日の朝の回答情報は3であり、同日夕方の回答情報は2である。そのため、6月6日の気分の変化量は、2-3=-1である(図6参照)。 For example, as shown in Figure 5, the response information for the morning and evening of June 5th is 2. Therefore, the change in mood on June 5th is 2-2=0 (see Figure 6). Also, as shown in Figure 5, the response information for the morning of June 6th is 3, and the response information for the evening of the same day is 2. Therefore, the change in mood on June 6th is 2-3=-1 (see Figure 6).
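The per-day computation above can be sketched as follows, assuming the four answer values run from best to worst (consistent with good = 2 and bad = 3 in the example); the function name is hypothetical:

```python
def mood_change(morning_answer, evening_answer):
    """Daily mood change: evening answer value minus morning answer value.
    Answers are coded 1..4 with smaller values for better moods, so a
    negative change means the mood improved over the day."""
    return evening_answer - morning_answer

print(mood_change(2, 2))  # June 5: good -> good, change 0
print(mood_change(3, 2))  # June 6: bad -> good, change -1
```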
 そして、本実施形態のエンゲージメント推定方法では、回答情報に基づいた特徴量は、労働者の気分の変化量のばらつきを含む。ばらつきは、一定期間(ここでは、n日間とする)における労働者の気分の変化量から算出される標準偏差σである。すなわち、或る労働者の一日の中での気分の変化量のデータがn日分得られたとき、1、2、3、……、n日目の気分の変化量をそれぞれ、X1、X2、X3、……、Xnとし、X1~Xnの平均値をXaveとすると、標準偏差σは、次式で算出される。 In the engagement estimation method of this embodiment, the feature based on the response information includes the variability in the amount of change in the worker's mood. The variability is the standard deviation σ calculated from the amount of change in the worker's mood over a certain period (here, n days). In other words, when data on the amount of change in mood of a certain worker over a period of n days is obtained, and the amount of change in mood on the 1st, 2nd, 3rd, ..., nth days are X1, X2, X3, ..., Xn, respectively, and the average value of X1 to Xn is Xave, the standard deviation σ is calculated using the following formula.
$$\sigma=\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(X_i-X_{ave}\right)^{2}}$$
 本実施形態では、このようにして算出された標準偏差σ(ばらつき)が、回答情報に基づいた特徴量である。なお、一変形例として、特徴量としてのばらつきは、標準偏差σに代えて、分散であってもよい。 In this embodiment, the standard deviation σ (variation) calculated in this manner is a feature based on the response information. As a modified example, the variation as a feature may be variance instead of the standard deviation σ.
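Assuming the population form of the standard deviation (division by n, as in the formula above), the variability can be computed directly with the standard library; the sample data here are hypothetical:

```python
import statistics

# n days of daily mood changes (hypothetical values)
changes = [0, -1, 1, 0, -1]

# statistics.pstdev divides by n, matching the sigma defined above.
sigma = statistics.pstdev(changes)
print(sigma)
```

For the variance variant mentioned in the text, `statistics.pvariance(changes)` would be used instead.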
 ばらつきとエンゲージメントとの相関は、上述のように回帰式により表される。回帰式は、1人1人の労働者ごとに求められる。一例として、或る第1の労働者についてばらつきとエンゲージメントとの相関を表す回帰式を求めた場合に、ばらつきが大きいほどエンゲージメントが低くなる。なお、第1の労働者とは別の第2労働者についてばらつきとエンゲージメントとの相関を表す回帰式を求めた場合は、第1の労働者の場合と比較してばらつきの変化に対するエンゲージメントの変化の割合が異なったり、ばらつきが大きいほどエンゲージメントが高くなったりする可能性がある。 The correlation between variability and engagement is expressed by a regression equation as described above. A regression equation is calculated for each individual worker. As an example, when a regression equation showing the correlation between variability and engagement is calculated for a first worker, the greater the variability, the lower the engagement. Note that when a regression equation showing the correlation between variability and engagement is calculated for a second worker different from the first worker, the rate of change in engagement relative to a change in variability may be different compared to the first worker, or the greater the variability, the higher the engagement.
 (10.4)推定部
 推定部25は、第5ステップ(推定ステップ)を実行する。すなわち、推定部25は、特徴量決定部23で決定された複数の特徴量と、回帰部24で求められた回帰式と、に基づいて、第1の期間とは別の第2の期間(ここでは一例として、直近の期間とする)における労働者のエンゲージメントを推定する。例えば、労働者と2次上司との会話時間と、労働者の残業時間と、労働者の気分の変化量のばらつきと、の3つのパラメータが、特徴量決定部23により特徴量として決定されたとする。この場合、推定部25は、上記3つのパラメータの各々を説明変数、エンゲージメントを目的変数とする重回帰式(回帰部24で求めたもの)に、先月における上記3つのパラメータを代入することで、先月のパラメータを反映した直近のエンゲージメントを求める。
(10.4) Estimation Unit The estimation unit 25 executes the fifth step (estimation step). That is, the estimation unit 25 estimates the engagement of the worker in a second period (the most recent period, as an example here) different from the first period, based on the multiple feature amounts determined by the feature amount determination unit 23 and the regression equation obtained by the regression unit 24. For example, it is assumed that three parameters, namely, the conversation time between the worker and the second-level supervisor, the overtime hours of the worker, and the variability in the worker's mood changes, are determined as feature amounts by the feature amount determination unit 23. In this case, the estimation unit 25 obtains the most recent engagement reflecting the parameters of the previous month by substituting the above three parameters in the previous month into a multiple regression equation (obtained by the regression unit 24) in which each of the above three parameters is an explanatory variable and the engagement is the objective variable.
 なお、第1の期間とは別の第2の期間のうち、複数の時点においてパラメータが得られた場合は、例えば、複数の時点のパラメータの平均値を重回帰式に代入してエンゲージメントを求めてもよい。あるいは、複数の時点のパラメータをそれぞれ重回帰式に代入して複数のエンゲージメントを求めてもよい。 If parameters are obtained at multiple time points during a second period that is different from the first period, the engagement may be calculated, for example, by substituting the average value of the parameters at the multiple time points into the multiple regression equation. Alternatively, multiple engagements may be calculated by substituting the parameters at the multiple time points into the multiple regression equation.
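Putting the pieces together, the estimation step amounts to substituting second-period feature values into the regression fitted on the first period. A sketch with hypothetical numbers, using NumPy least squares in place of the embodiment's regression machinery:

```python
import numpy as np

# First period: rows are months, columns are the three selected features
# (conversation time with the second-level supervisor, overtime hours,
# variability of mood changes); y is the questionnaire-based engagement.
# All values below are invented for illustration.
X_first = np.array([[10.0, 20.0, 0.5],
                    [12.0, 18.0, 0.7],
                    [ 8.0, 25.0, 0.4],
                    [11.0, 22.0, 0.6],
                    [ 9.0, 30.0, 0.8]])
y_first = np.array([3.0, 3.4, 2.5, 3.1, 2.4])

# Fit the multiple regression (with intercept) on the first period.
A = np.column_stack([np.ones(len(y_first)), X_first])
coef, *_ = np.linalg.lstsq(A, y_first, rcond=None)

# Second period: substitute the previous month's feature values to estimate
# engagement without running the first questionnaire again.
x_second = np.array([1.0, 9.0, 24.0, 0.5])  # leading 1.0 is the intercept term
engagement_estimate = float(x_second @ coef)
print(engagement_estimate)
```

When features are available at several time points within the second period, either their averages or each time point's values can be substituted, as the text notes.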
 (10.5)提示内容生成部
 提示内容生成部26は、提示内容生成ステップを実行する。すなわち、提示内容生成部26は、推定部25による第5ステップ(推定ステップ)での推定結果を基に、閲覧者に提示する内容を生成する。閲覧者に提示する内容は、例えば、第5ステップで求められたエンゲージメントを表す数値を含む。閲覧者は、エンゲージメントを求める対象となった労働者自身であってもよいし、別の者(例えば、労働者の上司)であってもよい。
(10.5) Presentation Content Generation Unit The presentation content generation unit 26 executes the presentation content generation step. That is, the presentation content generation unit 26 generates content to be presented to the viewer based on the estimation result of the fifth step (estimation step) by the estimation unit 25. The content to be presented to the viewer includes, for example, a numerical value representing the engagement obtained in the fifth step. The viewer may be the worker whose engagement is estimated, or another person (for example, the worker's supervisor).
 提示内容生成部26は、例えば、調査時期ごと(例えば、1か月ごと)の、複数の労働者の各々のエンゲージメントの一覧を、閲覧者に提示する内容として生成する。 The presentation content generation unit 26 generates, for example, a list of the engagement of each of a number of workers for each survey period (e.g., each month) as content to be presented to the viewer.
 また、提示内容生成部26は、例えば、調査時期ごとの、或る労働者のエンゲージメントの一覧を、閲覧者に提示する内容として生成する。 The presentation content generation unit 26 also generates, for example, a list of a worker's engagement for each survey period as content to be presented to the viewer.
 (10.6)通信処理部
 通信処理部27は、通信部12による情報の送受信を制御する。通信処理部27は、通信部12を制御することで、送信ステップを実行する。送信ステップは、提示内容生成ステップで生成された内容を端末に送信するステップである。端末は、例えば、PC8である。端末は、受信した内容を表示するディスプレイを備える。閲覧者は、端末のディスプレイを介して、提示内容生成ステップで生成された内容を閲覧する。
(10.6) Communication Processing Unit The communication processing unit 27 controls the transmission and reception of information by the communication unit 12. The communication processing unit 27 executes the transmission step by controlling the communication unit 12. The transmission step is a step of transmitting the content generated in the presentation content generation step to a terminal. The terminal is, for example, a PC 8. The terminal has a display that displays the received content. The viewer views the content generated in the presentation content generation step via the display of the terminal.
 一例として、送信ステップでは、一定の間隔で、提示内容生成ステップで生成された内容を端末に送信する。一定の間隔は、例えば、1週間、2週間、1か月、又は、2か月である。 As an example, in the transmission step, the content generated in the presentation content generation step is transmitted to the terminal at regular intervals. The regular intervals are, for example, one week, two weeks, one month, or two months.
 (11)エンゲージメント推定の具体例
 以下では、エンゲージメント推定システム1によるエンゲージメントの推定について、図3を参照して具体例を述べる。
(11) Specific Example of Engagement Estimation A specific example of engagement estimation by the engagement estimation system 1 will be described below with reference to FIG.
 労働者の生体情報、位置情報及び関係性情報等を含んだ、労働者の情報は、定期的に又は不定期に、計測及び労働者へのアンケート等によって収集される。そして、労働者の情報は、1か月ごとに集計される。例えば、図3では、1月における労働者の情報と、2月における労働者の情報と、……、がそれぞれ集計される。 Worker information, including biometric information, location information, and relationship information, is collected periodically or irregularly through measurements and questionnaires given to workers. The worker information is then compiled on a monthly basis. For example, in Figure 3, worker information for January, worker information for February, etc. are compiled.
 また、情報処理サーバ7は、申告情報に含まれる上述の第1回答に基づいて、エンゲージメントを推定する。ここでは、1か月ごとに労働者へのアンケートが実施され、アンケートへの回答として第1回答が得られる。そして、情報処理サーバ7は、第1回答に基づいて、各月のエンゲージメントを推定する。例えば、図3では、情報処理サーバ7は、1月における第1回答に基づいて、1月のエンゲージメントを推定し、2月における第1回答に基づいて、2月のエンゲージメントを推定し、3月における第1回答に基づいて、3月のエンゲージメントを推定する。 The information processing server 7 also estimates engagement based on the above-mentioned first response included in the declaration information. Here, a survey is conducted on workers once a month, and a first response is obtained as a response to the survey. Then, the information processing server 7 estimates engagement for each month based on the first response. For example, in FIG. 3, the information processing server 7 estimates engagement for January based on the first response in January, estimates engagement for February based on the first response in February, and estimates engagement for March based on the first response in March.
 Once a certain amount of engagement data estimated by the information processing server 7 has been accumulated, engagement estimation by the engagement estimation system 1 becomes possible. In FIG. 3, the engagement estimation system 1 starts estimating engagement after three months' worth of engagement data estimated by the information processing server 7 has been accumulated. Hereinafter, the engagement data estimated by the information processing server 7 is referred to as "reference engagement data." In the example shown in FIG. 3, January to March, before the engagement estimation system 1 starts estimating engagement, corresponds to the first period described above.
 To estimate engagement, the engagement estimation system 1 first determines a multiple regression equation and feature quantities. For example, when estimating engagement for April (the second period), the engagement estimation system 1 refers to worker information and reference engagement data for a period other than April (the first period). In FIG. 3, the engagement estimation system 1 refers to worker information and reference engagement data for January to March, and from these determines the multiple regression equation and the feature quantities. More specifically, the engagement estimation system 1 obtains a multiple regression equation using the engagement for January to March (the first period) as the objective variable and multiple parameters extracted from the worker information for January to March (the first period) as multiple explanatory variables, and adopts those explanatory variables as the multiple feature quantities when the coefficient of determination of the multiple regression equation exceeds a threshold value.
 Next, the engagement estimation system 1 estimates the engagement for April (the second period) from the multiple regression equation and the feature quantities. More specifically, the engagement estimation system 1 obtains the engagement for April (the second period) by substituting the feature quantities obtained from the worker information for April (the second period) into the multiple regression equation.
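The procedure above (fit on the first period, screen the explanatory variables by the coefficient of determination, then predict the second period) can be sketched as follows. This is a minimal illustration only, not the patented implementation; the function names, the threshold value of 0.8, the sample data, and the use of ordinary least squares are assumptions.

```python
import numpy as np

def fit_and_select(X, y, r2_threshold=0.8):
    """Fit y = b0 + b1*x1 + ... by least squares and accept the
    explanatory variables as feature quantities only if R^2 exceeds
    the threshold (coefficient of determination check)."""
    A = np.column_stack([np.ones(len(y)), X])      # add intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares fit
    y_hat = A @ coef
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return (coef, r2) if r2 > r2_threshold else (None, r2)

def predict(coef, x_new):
    """Substitute second-period feature quantities into the equation."""
    return coef[0] + np.dot(coef[1:], x_new)

# First period (e.g. Jan-Mar): rows = observations, columns = candidate
# parameters (conversation time, overtime hours, ...), all illustrative.
X1 = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
y1 = np.array([7.0, 8.0, 13.0, 14.0])              # reference engagement
coef, r2 = fit_and_select(X1, y1)
if coef is not None:
    # Second period (e.g. April): plug the same features in.
    april_engagement = predict(coef, np.array([2.5, 2.5]))
```

Because the toy data are exactly linear, the coefficient of determination is 1.0 and the explanatory variables would be accepted as feature quantities.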
 Next, the engagement estimation system 1 generates content to be presented to the viewer based on the estimated engagement, and transmits the content to the terminal (PC 8). For example, the engagement estimation system 1 generates and transmits the presentation content once a month.
 By proceeding in the same way as for April, engagement at points in time after April can be estimated. For example, the engagement for May can be obtained by substituting the feature quantities obtained from the worker information for May into the multiple regression equation. In this case, the multiple regression equation need not be determined anew; the equation obtained when estimating the engagement for April may be reused.
 From April onward (the second period), it is not essential to conduct the questionnaire (questions) for obtaining the first response. However, even after April, the question for having the worker select, from among the multiple images, the image that matches the worker's mood (a question included in the second question) is still conducted.
 If the questionnaire for obtaining the first response is not conducted from April onward, the respondents (workers and the like) no longer need to set aside time to answer it, so their burden is reduced. Moreover, regardless of whether that questionnaire is conducted from April onward, the engagement from April onward is estimated after verifying whether information other than the first response (such as the worker's location information) correlates with engagement. The engagement estimation system 1 can therefore estimate engagement more objectively.
 The process of determining the multiple regression equation and the feature quantities need not be executed every time the engagement estimation system 1 estimates engagement; it suffices to execute it only once, the first time (that is, only when estimating the engagement for April).
 Alternatively, the process of determining (updating) the multiple regression equation and the feature quantities may be executed each time a period longer than the estimation interval of the engagement estimation system 1 (one month) elapses (for example, every six months). In this case as well, the frequency of conducting the questionnaire for obtaining the first response can be reduced, for example, which has the advantage of reducing the burden.
 For the engagement estimation system 1 to estimate the engagement for the second period (for example, April), it is not essential that the system collect, from the worker information for the second period, information other than the multiple feature quantities determined by the feature determination unit 23. For example, suppose that the conversation time between the worker and the second-level supervisor, the worker's overtime hours, and the variability in the amount of change in the worker's mood have been determined as the multiple feature quantities by the feature determination unit 23. In this case, it is not essential for the engagement estimation system 1 to collect, from the worker information for the second period, information other than these feature quantities (for example, the worker's activity level).
 Note that, to estimate the engagement for April (the second period), the engagement estimation system 1 may determine the multiple regression equation and the multiple feature quantities by referring to worker information and reference engagement data for a period later than April (the second period) (for example, May to June).
 (12) Specific Example of the First Step
 Next, a specific example of the flow of the first step will be described with reference to FIG. 7. The first step is a step of executing, at multiple points in time (the morning and evening of each day), a questioning step in which the operation window 820 is displayed on the display unit 82 of the PC 8 and the worker is prompted to select one of the images.
 First, the PC 8 determines whether the current time is a specific time in the morning or evening (step ST1). When the specific time arrives (step ST1: Yes), the PC 8 displays a question on the display unit 82 (step ST2). More specifically, the PC 8 displays on the display unit 82 the operation window 820, which presents a question prompting the worker to select, from among the multiple images, the image that matches the worker's mood.
 The worker answers the question by selecting one of the images. When the question is answered (step ST3: Yes), the processing unit 81 stores the answer time and the answer content in the storage unit 84 (step ST4). The answer content is a numerical value of 1, 2, 3, or 4 serving as the answer information. The processing unit 81 then transmits the answer time and the answer content to the engagement estimation system 1 (step ST5).
 If no answer is given within a certain period of time after the operation window 820 is displayed on the display unit 82 (step ST3: No), the processing unit 81 issues a notification prompting an answer (step ST6). Until an answer is given, the processing unit 81 issues this notification at regular intervals.
 By performing the series of processes shown in FIG. 7 every day, answer information for the morning and evening of each day is acquired by the engagement estimation system 1.
 Note that the flowchart shown in FIG. 7 merely shows one example of the flow of the processing performed by the PC 8; the order of the processing may be changed as appropriate, and processing may be added or omitted as appropriate.
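The flow of steps ST1 to ST6 can be sketched as a simple dispatch function; this is an illustrative sketch only, and the function names, the ask times, and the callback interfaces are assumptions not taken from the specification.

```python
import datetime

ASK_TIMES = (datetime.time(9, 0), datetime.time(17, 0))  # morning / evening (assumed)
VALID_ANSWERS = {1, 2, 3, 4}                             # mood image -> numeric value

def handle_tick(now, answer, store, send, remind):
    """One pass of the ST1-ST6 flow for a single ask time.

    now     -- current datetime (ST1)
    answer  -- the worker's selection, or None if not yet answered (ST3)
    store   -- callback saving (time, answer) locally (ST4)
    send    -- callback transmitting (time, answer) to the system (ST5)
    remind  -- callback issuing a notification prompting an answer (ST6)
    """
    if now.time() not in ASK_TIMES:       # ST1: not a specific time yet
        return "waiting"
    if answer in VALID_ANSWERS:           # ST3: Yes
        store(now, answer)                # ST4
        send(now, answer)                 # ST5
        return "answered"
    remind()                              # ST3: No -> ST6 (repeated until answered)
    return "reminded"

log = []
state = handle_tick(
    datetime.datetime(2024, 6, 5, 9, 0), 2,
    store=lambda t, a: log.append(("store", a)),
    send=lambda t, a: log.append(("send", a)),
    remind=lambda: log.append(("remind", None)),
)
```

In a real deployment this function would be driven by a timer loop that re-invokes it at regular intervals until it returns "answered".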
 (13) Details of the Feature Quantities
 A feature quantity may, for example, relate to the "job resources," "personal resources," or "job demands" defined in the "job demands-resources model" (the "JD-R model"). A feature quantity may also relate to empathy with, or a sense of agreement with, at least one of the vision, mission, and philosophy advocated by the group (a company or the like) to which the worker belongs for work. Alternatively, a feature quantity may relate to two or more of the above.
 As an example, "job resources" relate to at least one of "support from the surroundings," "relationships with the surroundings," "job autonomy," "coaching from colleagues," "feedback from colleagues," "diversity of human relationships," and "opportunities for career development."
 As an example, "personal resources" relate to at least one of "optimism," "resilience," and "recovery status."
 As an example, "job demands" relate to at least one of "quantitative workload," "qualitative workload," and "physical workload at work."
 As described above, the feature determination unit 23 extracts, for example, multiple parameters from the worker information and determines multiple feature quantities from among them. The parameters (feature quantities) are, for example, the conversation time between the worker and a specific person, the worker's activity level, and the worker's overtime hours. The multiple feature quantities determined by the feature determination unit 23 include at least one of: a feature quantity based on the worker's biological information measured by the biological information measurement terminal 3; a feature quantity based on the worker's location information at the worker's workplace; a feature quantity based on relationship information concerning the relationship between the worker and another worker at the workplace; and a feature quantity based on the usage history of the computer system (PC 8) the worker uses for work. The multiple feature quantities determined by the feature determination unit 23 further include a feature quantity based on the answer information.
 Examples of feature quantities are listed below. The feature quantities are not, however, limited to those listed.
 (13.1) Job Resources
 The conversation time between the worker and a specific person, as a feature quantity, relates to "support from the surroundings" and "relationships with the surroundings" among the "job resources." The (face-to-face) conversation time between the worker and a specific person can be obtained, for example, from the location information and voice information of each of the multiple workers. That is, the time during which the distance between the worker and the specific person is within a predetermined distance (for example, one meter) and voice information is being output from a microphone present around the conversation location can be taken as the conversation time between the worker and the specific person. The microphone may, for example, be carried by the worker or be installed near the worker's work location.
 The (face-to-face) conversation time between the worker and a specific person can also be obtained, for example, from the location information of each of the multiple workers alone. That is, the conversation time can be obtained by conveniently judging that the worker and the specific person being close to each other is for the purpose of conversation. For example, the time during which the distance between the worker and the specific person is within the predetermined distance can be taken as their conversation time.
 The conversation time between the worker and a specific person is not limited to face-to-face conversation time and may include non-face-to-face (for example, online) conversation time. Online conversation time is extracted, for example, from the usage history of the PC 8.
 The face-to-face conversation time and the non-face-to-face conversation time between the worker and a specific person may also each be obtained separately.
 The conversation time between the worker and a specific person may also be extracted from the declaration information input to the operation terminal 6. That is, the engagement estimation system 1 may acquire the conversation time between the worker and a specific person based on declarations from respondents (workers and the like).
 As described above, the relationship information is information concerning the mutual relationships of multiple workers at their workplace. Based on the relationship information, the feature determination unit 23 may obtain the conversation time of the multiple workers for each relationship. That is, the feature determination unit 23 may obtain the conversation time between the worker and a worker in a specific position. For example, the feature determination unit 23 may obtain the conversation time between the worker and a supervisor, between the worker and a subordinate, and between workers holding the same post. The relationships may be further subdivided. For example, the feature determination unit 23 may obtain the conversation time between the worker and the first-level supervisor (immediate supervisor) and between the worker and the second-level supervisor (the immediate supervisor's own immediate supervisor). It may also, for example, obtain the conversation time between workers belonging to the same organization (department or the like), between workers belonging to different organizations (departments or the like), or between workers in charge of the same work (project or the like).
 The number of conversations between the worker and a specific person may also be obtained as a feature quantity. The number of face-to-face conversations can be obtained, for example, from the location information of each of the multiple workers. That is, the number of times the distance between the worker and the specific person changes from beyond the predetermined distance (for example, one meter) to within it can be taken as the number of conversations between them.
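As a sketch of the proximity-only approach above, both the conversation time and the conversation count can be derived from a series of sampled distances between two workers. The sampling interval, the one-meter threshold, and the data are illustrative assumptions.

```python
def conversation_stats(distances_m, sample_interval_s=60, threshold_m=1.0):
    """distances_m: distances between two workers at regular samples.
    Returns (total conversation seconds, number of conversations),
    counting time within the threshold as conversation time and each
    beyond-to-within transition as the start of a new conversation."""
    total_s = 0
    count = 0
    previously_near = False
    for d in distances_m:
        near = d <= threshold_m
        if near:
            total_s += sample_interval_s
            if not previously_near:     # crossed from beyond to within
                count += 1
        previously_near = near
    return total_s, count

# Two encounters over seven one-minute samples (illustrative data)
dists = [5.0, 0.8, 0.9, 4.0, 6.0, 0.7, 3.0]
total_s, count = conversation_stats(dists)
```

With these samples, three near readings yield 180 seconds of conversation time across two separate conversations.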
 The worker's overtime hours, number of overtime occurrences, late-night working hours, and holiday working hours, as feature quantities, relate to "job autonomy" among the "job resources." These are extracted, for example, from the attendance information output from the attendance management system 9 or from the declaration information input to the operation terminal 6. Alternatively, they can be obtained by judging from the start-up and shutdown times of the PC 8 extracted from its usage history.
 The mood at the end of the workday, as a feature quantity, relates to "coaching from colleagues" and "feedback from colleagues" among the "job resources." The mood at the end of the workday is extracted, for example, from the declaration information input to the operation terminal 6.
 The number of communications within the department and the number of people communicated with within the department, as feature quantities, relate to "diversity of human relationships" among the "job resources." The number of face-to-face communications and the number of people communicated with face-to-face within the department can be obtained, for example, from the location information (or the location information and voice information) of each of the multiple workers, or extracted from the declaration information input to the operation terminal 6. The number of non-face-to-face (for example, online) communications and the number of people communicated with non-face-to-face within the department can be extracted, for example, from the usage history of the PC 8 or from the declaration information input to the operation terminal 6.
 The number of spaces used and the number of times specialized tools are used, as feature quantities, relate to "opportunities for career development" among the "job resources." These are obtained, for example, from the worker's location information, or extracted from the usage history of the PC 8 or from the declaration information input to the operation terminal 6.
 (13.2) Personal Resources
 The variability in the change in mood between morning and evening, as a feature quantity, relates to "optimism" among the "personal resources." This variability is calculated based on the answers input to the PC 8.
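One plausible way to quantify this variability, offered here for illustration only (the specification does not fix the statistic), is the standard deviation of the daily morning-to-evening differences of the numeric answers:

```python
import statistics

def mood_change_variability(daily_answers):
    """daily_answers: list of (morning, evening) numeric answers (1-4).
    Returns the population standard deviation of the per-day change
    (evening - morning), as one possible variability measure."""
    deltas = [evening - morning for morning, evening in daily_answers]
    return statistics.pstdev(deltas)

# Four days of (morning, evening) answers (illustrative data)
variability = mood_change_variability([(2, 2), (3, 2), (2, 4), (1, 2)])
```

Here the daily changes are 0, -1, 2, and 1, giving a variance of 1.25 about their mean.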
 Break time, time with no PC input, and the number of periods with no PC input (the number of times a no-input state continued for a predetermined time or longer), as feature quantities, relate to "resilience" among the "personal resources." These are extracted, for example, from the usage history of the PC 8 or from the declaration information input to the operation terminal 6.
 The amount of movement within the office, as a feature quantity, relates to "resilience" among the "personal resources." The amount of movement within the office is obtained, for example, from the worker's location information, or measured by a pedometer (exercise measurement terminal 10) carried by the worker.
 The PC operating time and the number of PC operating sessions before the start of work, after the end of work, during late-night hours, and on holidays, as feature quantities, relate to "recovery status" among the "personal resources." These are extracted, for example, from the usage history of the PC 8, the attendance information output from the attendance management system 9, or the declaration information input to the operation terminal 6.
 The interval between working hours (the length of time from the end of work on a given day to the start of work on the following day), as a feature quantity, relates to "recovery status" among the "personal resources." This interval is extracted, for example, from the usage history of the PC 8, the attendance information output from the attendance management system 9, or the declaration information input to the operation terminal 6.
 The volume of the worker's voice, as a feature quantity, relates to "recovery status" among the "personal resources." The volume of the worker's voice is measured, for example, by a microphone provided in the biological information measurement terminal 3.
 (13.3) Job Demands
 The working time (the length of time from entering the workplace to leaving it), as a feature quantity, relates to "quantitative workload" among the "job demands." The working time is obtained, for example, from the worker's location information or the attendance information output from the attendance management system 9.
 The PC usage time after the end of working hours, as a feature quantity, relates to "quantitative workload" among the "job demands." It is extracted, for example, from the usage history of the PC 8.
 The usage time and the number of uses of rest areas, as feature quantities, relate to "quantitative workload" among the "job demands." These are obtained, for example, from a combination of the worker's location information and area information concerning rest areas and the like. The area information is acquired, for example, from the data server 5.
 The time and the number of periods with no PC input between the start and the end of working hours (the number of times a no-input state continued for a predetermined time or longer), as feature quantities, relate to "quantitative workload" among the "job demands." These are extracted, for example, from the usage history of the PC 8.
 The amount of conversation at the workplace, as a feature quantity, relates to "quantitative workload" among the "job demands." It can be obtained, for example, from the output (voice information) of the microphone provided in the biological information measurement terminal 3.
 The number of keyboard operations per unit time and the amount of mouse cursor movement per unit time, as feature quantities, relate to "qualitative workload" among the "job demands." These are extracted, for example, from the usage history of the PC 8.
 The PC operating time, as a feature quantity, relates to "qualitative workload" among the "job demands." It is extracted, for example, from the usage history of the PC 8.
 The working time under a high qualitative burden, as a feature quantity, relates to "qualitative workload" among the "job demands." It is obtained, for example, from the biological information measured by the biological information measurement terminal 3. As a specific example, the engagement estimation system 1 assumes that a state in which the heart rate, as biological information, exceeds a corresponding threshold is a state of high qualitative burden, and obtains the accumulated time of that state as the working time under a high qualitative burden.
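The specific example above (accumulating the time during which the heart rate exceeds a threshold) can be sketched as follows; the per-minute sampling interval, the 100 bpm threshold, and the data are illustrative assumptions rather than values from the specification.

```python
def high_burden_seconds(heart_rates_bpm, sample_interval_s=60, threshold_bpm=100):
    """Accumulate the time during which the measured heart rate exceeds
    the threshold, treating each sample as covering one sampling interval."""
    return sum(sample_interval_s for hr in heart_rates_bpm if hr > threshold_bpm)

# One hour of per-minute heart-rate samples (illustrative data)
rates = [85, 92, 104, 110, 98, 101] * 10
burden_s = high_burden_seconds(rates)
```

With three of every six samples above the threshold, this hour contributes 1800 seconds of high-qualitative-burden working time.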
 The worker's activity level, as a feature quantity, relates to "physical workload at work" among the "job demands." The activity level is expressed, for example, in METs, and is obtained, for example, from the biological information (heart rate, blood pressure, skin temperature, amount of perspiration, or the like) measured by the biological information measurement terminal 3.
 The worker's step count, as a feature quantity, relates to "physical workload at work" among the "job demands." The step count is obtained, for example, from the worker's location information, or measured by a pedometer (exercise measurement terminal 10) carried by the worker.
 The worker's maximum heart rate, as a feature quantity, relates to "physical workload at work" among the "job demands." The maximum heart rate is extracted, for example, from the heart-rate measurement data serving as biological information.
 (13.4) Degree of Empathy with the Philosophy
 A feature quantity related to the degree of empathy with the philosophy (empathy with, or a sense of agreement with, at least one of the vision, mission, and philosophy advocated by the group to which the worker belongs for work) is extracted, for example, from the declaration information input to the operation terminal 6. Such a feature quantity is, for example, an index representing the extent to which the worker understands the meaning of the work.
 (14) Modifications
 Several modifications of the embodiment are described below. Hereinafter, the embodiment described above is referred to as the basic example.
 (14.1) Modification 1
 In this modification, the feature quantity based on the answer information includes the amount of change in the worker's mood between multiple points in time. That is, one of the multiple feature quantities is this amount of change.
 When the worker answers the questions in the first period, the amount of change in the first period is obtained. In the first period, the remaining feature quantities among the multiple feature quantities for the first period are also obtained by the biological information measurement terminal 3, the position measurement system 4, the data server 5, and the like.
 When the worker answers the questions in the second period, the amount of change in the second period is obtained. In the second period, the remaining feature quantities among the multiple feature quantities for the second period are also obtained by the biological information measurement terminal 3, the position measurement system 4, the data server 5, and the like.
 The regression unit 24 obtains a regression equation representing the relationship between the multiple feature quantities in the first period and the engagement. The estimation unit 25 estimates the worker's engagement in the second period based on the multiple feature quantities in the second period and the regression equation.
 As an example, when a regression equation representing the correlation between the amount of change in the worker's mood and the engagement is obtained for a certain first worker, the larger the amount of change, the lower the engagement. When a regression equation representing the correlation between the amount of change and the engagement is obtained for a second worker different from the first worker, the ratio of the change in engagement to the change in the amount of change may differ from that of the first worker, or the engagement may instead be higher as the amount of change is larger.
 In the basic example, the second period is one month (see FIG. 3). In this modification, by contrast, the second period may be, for example, one day. That is, the engagement on a given day may be estimated using the amount of change in mood from the morning to the evening of that day as a feature quantity. The second period may also be shorter than one day (for example, several hours).
 For example, if the second period includes June 5 and June 6, the amount of change in mood from morning to evening on June 5 and the amount of change in mood from morning to evening on June 6 are obtained. When multiple amounts of change are obtained in the second period in this way, the average of the multiple amounts of change may be used as the feature quantity.
 (14.2)変形例2
 本変形例では、回答情報に基づいた特徴量は、複数の時点の各々における労働者の気分に対応するパラメータである。つまり、朝と夕方のそれぞれにおける回答情報としての、1、2、3又は4の数値そのものが、回答情報に基づいた特徴量である。
(14.2) Modification 2
In this modification, the feature quantity based on the response information is a parameter corresponding to the worker's mood at each of a plurality of time points. In other words, the numerical values 1, 2, 3, or 4 as the response information in the morning and evening, respectively, are themselves feature quantities based on the response information.
 また、第2の期間において複数の回答情報が得られる場合は、複数の回答情報の平均値を特徴量としてもよい。例えば、第2の期間を6月5日の朝から6月6日の夕方までの期間とすると、図5に示す例では、4つの回答情報の平均値である(2+2+3+2)/4=2.25を特徴量としてもよい。 In addition, if multiple pieces of answer information are obtained in the second period, the average value of the multiple pieces of answer information may be used as the feature amount. For example, if the second period is from the morning of June 5th to the evening of June 6th, in the example shown in FIG. 5, the average value of the four pieces of answer information, (2+2+3+2)/4=2.25, may be used as the feature amount.
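The averaging described above can be written directly; the response values below follow the example quoted from FIG. 5 (morning and evening of June 5th and June 6th).

```python
# Response values (1 to 4) recorded in the second period, following
# the FIG. 5 example: morning/evening of June 5th and June 6th.
responses = [2, 2, 3, 2]

# When multiple pieces of response information are obtained in the
# second period, their average value is used as the feature quantity.
feature = sum(responses) / len(responses)
print(feature)  # → 2.25
```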
 一例として、或る第1の労働者について回答情報とエンゲージメントとの相関を表す回帰式を求めた場合に、回答情報を表す数値が大きいほどエンゲージメントが低くなる。なお、第1の労働者とは別の第2労働者について回答情報を表す数値とエンゲージメントとの相関を表す回帰式を求めた場合は、第1の労働者の場合と比較して回答情報を表す数値の変化に対するエンゲージメントの変化の割合が異なったり、例えば、回答情報を表す数値が1と4との間の所定の数値(例えば、2)に近いほどエンゲージメントが高くなったりする可能性がある。 As an example, when a regression equation showing the correlation between response information and engagement is obtained for a first worker, the larger the numerical value showing the response information, the lower the engagement. Note that when a regression equation showing the correlation between the numerical value showing the response information and engagement is obtained for a second worker different from the first worker, the rate of change in engagement relative to the change in the numerical value showing the response information may be different compared to the case of the first worker, and, for example, the closer the numerical value showing the response information is to a specified numerical value between 1 and 4 (e.g., 2), the higher the engagement may be.
 基本例では、第2の期間は、1か月間である(図3参照)。これに対して、本変形例では、第2の期間は、例えば、1日であってもよいし、1日よりも短い期間(例えば、数時間)であってもよい。 In the basic example, the second period is one month (see FIG. 3). In contrast, in this modified example, the second period may be, for example, one day, or may be a period shorter than one day (for example, a few hours).
 (14.3)実施形態のその他の変形例
 以下、実施形態のその他の変形例を列挙する。以下の変形例は、適宜組み合わせて実現されてもよい。また、以下の変形例は、上述の変形例1又は2と適宜組み合わせて実現されてもよい。
(14.3) Other Modifications of the Embodiment
Other modifications of the embodiment are listed below. The following modifications may be realized in appropriate combination. The following modifications may also be realized in appropriate combination with Modification 1 or 2 described above.
 エンゲージメント推定システム1は、提示内容生成ステップで生成された情報を表示する表示装置を備えていてもよい。 The engagement estimation system 1 may be equipped with a display device that displays the information generated in the presentation content generation step.
 エンゲージメント推定システム1は、申告情報を生成するための操作を受け付ける操作部を備えていて、操作端末6を兼ねていてもよい。 The engagement estimation system 1 may include an operation unit that accepts operations for generating the declaration information, and may also serve as the operation terminal 6.
 エンゲージメント推定システム1は、複数の時点において上述の質問ステップを実施する労働者用端末としての、PC8等を備えていてもよい。 The engagement estimation system 1 may include a PC 8 or the like as a worker terminal that performs the above-mentioned questioning steps at multiple points in time.
 基本例では、質問ステップが1日につき2回実施される。これに対して、質問ステップが1日につき1回、又は、3回以上実施されてもよい。あるいは、質問ステップが例えば、1日おき又は数日おきに実施されてもよい。 In the basic example, the questioning step is performed twice per day. In contrast, the questioning step may be performed once per day, or three or more times per day. Alternatively, the questioning step may be performed, for example, every other day or every few days.
 基本例では、複数の時点間での労働者の気分の変化量は、1日のうちの或る時点から、同日の或る時点にかけての労働者の気分の変化量である。これに対して、複数の時点間での労働者の気分の変化量は、或る日から、別の日にかけての労働者の気分の変化量であってもよい。 In the basic example, the amount of change in the worker's mood between multiple points in time is the amount of change from one point in time within a day to another point in time on the same day. In contrast, the amount of change in the worker's mood between multiple points in time may be the amount of change from one day to another day.
 操作ウィンドウ820に表示される、人の表情を表す複数の画像の各々は、イラストに限定されず、例えば、写真であってもよい。 Each of the multiple images showing human facial expressions displayed in the operation window 820 is not limited to illustrations and may be, for example, photographs.
 操作ウィンドウ820に表示される複数の画像で表される気分は、very good, good, bad, very badに限定されない。当該気分は、例えば、リラックスした気分、緊張した気分、又は、疲れた気分等であってもよい。 The moods represented by the multiple images displayed in the operation window 820 are not limited to very good, good, bad, or very bad. The moods may be, for example, a relaxed mood, a tense mood, or a tired mood.
 操作ウィンドウ820に表示される複数の画像の各々は、人の表情を表す画像に限定されず、人の気分を表す画像又は記号であってもよい。人の気分を表す画像又は記号は、例えば、疲れた気分を表す汗の画像又は記号、気分が良いことを表す8分音符若しくはハートマーク等の画像又は記号、気分が良いことを表す右上がりの矢印の画像又は記号、又は、気分が沈んでいることを表す右下がりの矢印の画像又は記号であってもよい。 Each of the multiple images displayed in the operation window 820 is not limited to images showing a person's facial expression, and may be an image or symbol showing a person's mood. The image or symbol showing a person's mood may be, for example, an image or symbol of sweat showing a tired mood, an image or symbol such as an eighth note or a heart mark showing a good mood, an image or symbol of an arrow pointing up to the right showing a good mood, or an image or symbol of an arrow pointing down to the right showing a depressed mood.
 また、操作ウィンドウ820に表示される複数の画像は、互いに同一の画像であって、複数の画像のうち2以上の画像が選択可能であってもよい。例えば、操作ウィンドウ820に表示される複数の画像は、複数の星のマークであってもよい。選択された星のマークの個数が多いほど、良い気分を表す。 Furthermore, the multiple images displayed in the operation window 820 may be identical to each other, and two or more of the multiple images may be selectable. For example, the multiple images displayed in the operation window 820 may be multiple star marks. The more star marks selected, the better the mood.
 また、操作ウィンドウ820に表示される複数の画像は、複数の天気のマークであってもよい。複数の天気のマークは、例えば、晴れ、曇り、及び、雨のマークを含む。晴れのマークは、曇りのマークよりも良い気分を表す。曇りのマークは、雨のマークよりも良い気分を表す。また、複数の天気のマークは、例えば、晴れのマークと曇りのマークとを組み合わせたマークを含んでいてもよい。当該マークは、晴れのマークよりも悪く、かつ、曇りのマークよりも良い気分を表す。 The multiple images displayed in the operation window 820 may also be multiple weather marks. The multiple weather marks may include, for example, sunny, cloudy, and rainy marks. The sunny mark represents a better mood than the cloudy mark. The cloudy mark represents a better mood than the rainy mark. The multiple weather marks may also include, for example, a mark that combines a sunny mark and a cloudy mark. This mark represents a worse mood than the sunny mark, but a better mood than the cloudy mark.
 基本例では、PC8の表示部82には、労働者の4つの気分を表す4つの画像が表示される。しかしながら、表示される画像の数(すなわち、気分の数)は、2つ、3つ又は5つ以上であってもよい。 In the basic example, four images representing four moods of the worker are displayed on the display unit 82 of the PC 8. However, the number of images displayed (i.e., the number of moods) may be two, three, or five or more.
 基本例では、操作ウィンドウ820において、各画像の近傍(下方)には、各画像が表す気分が文字により表示される。しかしながら、操作ウィンドウ820にこのような文字が表示されることは必須ではない。 In the basic example, the mood represented by each image is displayed in text near (below) each image in the operation window 820. However, it is not essential that such text be displayed in the operation window 820.
 PC8は、複数の画像の中から労働者の気分として当てはまる画像を労働者に選択させるための質問を表示する処理を、エンゲージメント推定システム1による制御に基づいて実行してもよい。例えば、エンゲージメント推定システム1は、所定の時間帯になると所定の指令信号をPC8に送信し、PC8は、指令信号を受信すると、表示部82に質問を表示してもよい。 The PC 8 may execute a process of displaying a question for prompting the worker to select an image that matches the worker's mood from among a plurality of images, based on control by the engagement estimation system 1. For example, the engagement estimation system 1 may transmit a predetermined command signal to the PC 8 at a predetermined time period, and the PC 8 may display a question on the display unit 82 upon receiving the command signal.
 回答情報に基づいた特徴量は、基本例では労働者の気分の変化量のばらつき、変形例1では労働者の気分の変化量、変形例2では回答情報そのものである。これらのうち2つ又は3つが、回答情報に基づいた特徴量としてエンゲージメントの推定に使用されてもよい。 The feature based on the response information is the variability in the amount of change in the worker's mood in the basic example, the amount of change in the worker's mood in Modification 1, and the response information itself in Modification 2. Two or three of these may be used as features based on the response information to estimate engagement.
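The three candidate features based on the response information (the variability of the change amount, the change amount itself, and the raw response values) can all be computed from the same morning/evening responses. The (morning, evening) sample pairs below are hypothetical values chosen only to illustrate the computations.

```python
from statistics import mean, pvariance

# Hypothetical (morning, evening) response pairs for several days.
daily_responses = [(2, 3), (1, 3), (2, 2), (3, 1)]

# Modification 1: amount of change in mood within each day.
changes = [evening - morning for morning, evening in daily_responses]

# Basic example: variability of the change amounts
# (population variance is used here as one measure of variability).
change_variability = pvariance(changes)

# Modification 2: the response values themselves, averaged over
# the second period.
raw_average = mean(v for pair in daily_responses for v in pair)

print(changes, change_variability, raw_average)
```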
 特徴量決定部23は、複数のパラメータと第1の期間のエンゲージメントとの相関の強さを求めるために、決定係数に代えて、相関係数を参照してもよい。 The feature determination unit 23 may refer to a correlation coefficient instead of the coefficient of determination to determine the strength of correlation between multiple parameters and engagement in the first period.
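For a simple linear regression on a single parameter, the coefficient of determination equals the square of the correlation coefficient, so either measure ranks candidate features by correlation strength in the same way. A minimal check with illustrative data (the feature and engagement values are assumptions for the example):

```python
import numpy as np

# Illustrative parameter values and first-period engagement.
feature = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
engagement = np.array([4.1, 3.6, 3.2, 2.6, 2.3])

# Correlation coefficient (Pearson's r).
r = np.corrcoef(feature, engagement)[0, 1]

# For single-parameter linear regression, the coefficient of
# determination R^2 is simply r squared.
r_squared = r ** 2
print(round(float(r), 3), round(float(r_squared), 3))
```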
 本開示におけるエンゲージメント推定システム1又はエンゲージメント推定方法の実行主体は、コンピュータシステムを含んでいる。コンピュータシステムは、ハードウェアとしてのプロセッサ及びメモリを主構成とする。コンピュータシステムのメモリに記録されたプログラムをプロセッサが実行することによって、本開示におけるエンゲージメント推定システム1又はエンゲージメント推定方法の実行主体としての機能の少なくとも一部が実現される。プログラムは、コンピュータシステムのメモリに予め記録されてもよく、電気通信回線を通じて提供されてもよく、コンピュータシステムで読み取り可能なメモリカード、光学ディスク、ハードディスクドライブ等の非一時的記録媒体に記録されて提供されてもよい。コンピュータシステムのプロセッサは、半導体集積回路(IC)又は大規模集積回路(LSI)を含む1ないし複数の電子回路で構成される。ここでいうIC又はLSI等の集積回路は、集積の度合いによって呼び方が異なっており、システムLSI、VLSI(Very Large Scale Integration)、又はULSI(Ultra Large Scale Integration)と呼ばれる集積回路を含む。さらに、LSIの製造後にプログラムされる、FPGA(Field-Programmable Gate Array)、又はLSI内部の接合関係の再構成若しくはLSI内部の回路区画の再構成が可能な論理デバイスについても、プロセッサとして採用することができる。複数の電子回路は、1つのチップに集約されていてもよいし、複数のチップに分散して設けられていてもよい。複数のチップは、1つの装置に集約されていてもよいし、複数の装置に分散して設けられていてもよい。ここでいうコンピュータシステムは、1以上のプロセッサ及び1以上のメモリを有するマイクロコントローラを含む。したがって、マイクロコントローラについても、半導体集積回路又は大規模集積回路を含む1ないし複数の電子回路で構成される。 The entity that executes the engagement estimation system 1 or the engagement estimation method in the present disclosure includes a computer system. The computer system is mainly composed of a processor and a memory as hardware. At least a part of the functions of the entity that executes the engagement estimation system 1 or the engagement estimation method in the present disclosure is realized by the processor executing a program recorded in the memory of the computer system. The program may be pre-recorded in the memory of the computer system, may be provided through a telecommunications line, or may be provided by being recorded on a non-transitory recording medium such as a memory card, an optical disk, or a hard disk drive that is readable by the computer system. The processor of the computer system is composed of one or more electronic circuits including a semiconductor integrated circuit (IC) or a large scale integrated circuit (LSI). 
The integrated circuits such as ICs or LSIs referred to here are called different names depending on the degree of integration, and include integrated circuits called system LSIs, VLSIs (Very Large Scale Integration), or ULSIs (Ultra Large Scale Integration). Furthermore, a field-programmable gate array (FPGA) that is programmed after the LSI is manufactured, or a logic device that allows the reconfiguration of the connection relationships within the LSI or the reconfiguration of the circuit partitions within the LSI, can also be used as a processor. Multiple electronic circuits may be integrated into one chip, or may be distributed across multiple chips. Multiple chips may be integrated into one device, or may be distributed across multiple devices. The computer system referred to here includes a microcontroller having one or more processors and one or more memories. Thus, the microcontroller is also composed of one or more electronic circuits including a semiconductor integrated circuit or a large-scale integrated circuit.
 また、エンゲージメント推定システム1における複数の機能が、1つの装置に集約されていることはエンゲージメント推定システム1に必須の構成ではなく、エンゲージメント推定システム1の複数の構成要素は、複数の装置に分散して設けられていてもよい。さらに、エンゲージメント推定システム1の少なくとも一部の機能がサーバ又はクラウド(クラウドコンピューティング)等によって実現されてもよい。 Furthermore, it is not essential for the engagement estimation system 1 that multiple functions are consolidated into one device, and multiple components of the engagement estimation system 1 may be distributed across multiple devices. Furthermore, at least some of the functions of the engagement estimation system 1 may be realized by a server or a cloud (cloud computing), etc.
 反対に、基本例において、複数の装置に分散されている複数の機能が、1つの装置に集約されていてもよい。例えば、データサーバ5と、情報処理サーバ7と、エンゲージメント推定システム1と、のうち少なくとも2つが、1つの装置に集約されていてもよい。また、例えば、PC8が操作端末6を兼ねていてもよい。また、例えば、生体情報計測端末3が運動計測端末10を兼ねていてもよい。 On the other hand, in the basic example, multiple functions that are distributed across multiple devices may be consolidated into one device. For example, at least two of the data server 5, the information processing server 7, and the engagement estimation system 1 may be consolidated into one device. Also, for example, the PC 8 may also function as the operation terminal 6. Also, for example, the bio-information measuring terminal 3 may also function as the exercise measuring terminal 10.
 エンゲージメント推定システム1の少なくとも一部の機能が、機械学習により生成された演算モデルにより実現されてもよい。例えば、複数の特徴量を決定する第3ステップが、上記演算モデルにより実現されてもよい。 At least a portion of the functions of the engagement estimation system 1 may be realized by a computational model generated by machine learning. For example, the third step of determining a plurality of feature quantities may be realized by the computational model.
 (まとめ)
 以上説明した実施形態等から、以下の態様が開示されている。
(Summary)
The above-described embodiments and the like disclose the following aspects.
 第1の態様に係るエンゲージメント推定方法は、エンゲージメント推定システム(1)により実行され、仕事に対する労働者のエンゲージメントを推定する方法である。エンゲージメント推定方法は、第1ステップと、第2ステップと、第3ステップと、第4ステップと、第5ステップと、を有する。第1ステップでは、質問ステップを、複数の時点において実施する。質問ステップでは、複数の気分と一対一で対応した複数の画像を表示し、複数の画像の中から労働者の気分として当てはまる画像を労働者に選択させる。第2ステップでは、複数の時点の各々において複数の画像のうちいずれの画像が選択されたかに関する回答情報を記憶する。第3ステップでは、複数の特徴量を決定する。第4ステップでは、第1の期間における複数の特徴量と、予め求められた、第1の期間における労働者のエンゲージメントと、の関係を表す回帰式を取得する。第5ステップでは、第1の期間とは別の第2の期間における複数の特徴量と、回帰式と、に基づいて、第2の期間における労働者のエンゲージメントを推定する。複数の特徴量は、生体情報計測端末(3)により計測される労働者の生体情報に基づいた特徴量と、労働者の就業場所における、労働者の位置情報に基づいた特徴量と、職場における労働者と別の労働者との関係性に関する関係性情報に基づいた特徴量と、労働者が仕事に使用するコンピュータシステム(PC8)の使用履歴に基づいた特徴量と、のうち、少なくとも1つを含む。複数の特徴量は、回答情報に基づいた特徴量を更に含む。 The engagement estimation method according to the first aspect is a method for estimating a worker's engagement with work, which is executed by an engagement estimation system (1). The engagement estimation method has a first step, a second step, a third step, a fourth step, and a fifth step. In the first step, a questioning step is carried out at multiple time points. In the questioning step, multiple images corresponding one-to-one to multiple moods are displayed, and the worker is asked to select an image that corresponds to the worker's mood from the multiple images. In the second step, answer information regarding which image was selected from the multiple images at each of the multiple time points is stored. In the third step, multiple feature amounts are determined. In the fourth step, a regression equation is obtained that represents the relationship between the multiple feature amounts in the first period and the worker's engagement in the first period that is determined in advance. In the fifth step, the worker's engagement in the second period is estimated based on the multiple feature amounts in a second period different from the first period and the regression equation. 
The multiple feature amounts include at least one of the following: a feature amount based on the worker's biometric information measured by the biometric information measuring terminal (3), a feature amount based on the worker's location information at the worker's workplace, a feature amount based on relationship information regarding the relationship between the worker and another worker at the workplace, and a feature amount based on the usage history of a computer system (PC8) used by the worker for work. The multiple feature amounts further include a feature amount based on response information.
 上記の構成によれば、複数の画像の中から労働者の気分として当てはまる画像を労働者が選択することで、エンゲージメント推定システム(1)によりエンゲージメントが推定される。労働者は、現在の自分の気分に当てはまる画像を直感的に選択すればよい。そのため、回答に要する労働者の負担が軽減される。例えば、質問文が表示されて、その質問文を労働者が読んだ上で、労働者が回答を行う場合と比較して、労働者の負担が軽減される。 In the above configuration, the worker selects an image that matches the worker's mood from among multiple images, and the engagement estimation system (1) estimates the worker's engagement. The worker can intuitively select an image that matches his or her current mood. This reduces the burden on the worker in answering questions. For example, the burden on the worker is reduced compared to when a question is displayed, the worker reads the question, and then the worker answers the question.
 また、第2の態様に係るエンゲージメント推定方法は、第1の態様において、第4ステップでは、第1の期間における複数の特徴量と、予め求められた、第1の期間における労働者のエンゲージメントと、に基づいて、回帰式を求める。 In addition, in the engagement estimation method according to the second aspect, in the first aspect, in the fourth step, a regression equation is calculated based on a plurality of feature quantities in the first period and the engagement of the worker in the first period that has been calculated in advance.
 上記の構成によれば、回帰式を求める処理と、第2の期間における労働者のエンゲージメントを推定する処理と、をエンゲージメント推定システム(1)に集約できる。 With the above configuration, the process of calculating the regression equation and the process of estimating worker engagement in the second time period can be consolidated into the engagement estimation system (1).
 また、第3の態様に係るエンゲージメント推定方法は、第1又は2の態様において、第5ステップでの推定結果を基に、閲覧者に提示する内容を生成する提示内容生成ステップを更に有する。 In addition, the engagement estimation method according to the third aspect, in the first or second aspect, further includes a presentation content generation step of generating content to be presented to the viewer based on the estimation result in the fifth step.
 上記の構成によれば、閲覧者が推定結果を知ることができる。 The above configuration allows the viewer to know the estimation results.
 また、第4の態様に係るエンゲージメント推定方法では、第3の態様において、提示内容生成ステップで生成された内容を端末に送信する送信ステップを更に有する。 In addition, the engagement estimation method according to the fourth aspect, in the third aspect, further includes a transmission step of transmitting the content generated in the presentation content generation step to the terminal.
 上記の構成によれば、閲覧者が推定結果を知ることができる。 The above configuration allows the viewer to know the estimation results.
 また、第5の態様に係るエンゲージメント推定方法では、第1~4の態様のいずれか1つにおいて、第1ステップでは、質問ステップを、1日のうちの複数の時間帯に実施する。 In addition, in the engagement estimation method according to the fifth aspect, in any one of the first to fourth aspects, in the first step, the questioning step is carried out at multiple time periods in one day.
 上記の構成によれば、回答情報は、1日における労働者の気分の変化に関する情報を含むことになる。これにより、エンゲージメント推定システム(1)は、エンゲージメントを効果的に推定できる。 With the above configuration, the response information includes information regarding changes in the worker's mood throughout the day. This allows the engagement estimation system (1) to effectively estimate engagement.
 また、第6の態様に係るエンゲージメント推定方法では、第1~5の態様のいずれか1つにおいて、第3ステップは、複数の時点間での労働者の気分の変化量を求めるステップを含む。回答情報に基づいた特徴量は、変化量のばらつきを含む。 In addition, in the engagement estimation method according to the sixth aspect, in any one of the first to fifth aspects, the third step includes a step of determining the amount of change in the worker's mood between multiple points in time. The feature based on the response information includes the variance in the amount of change.
 上記の構成によれば、エンゲージメント推定システム(1)は、エンゲージメントを効果的に推定できる。  With the above configuration, the engagement estimation system (1) can effectively estimate engagement.
 また、第7の態様に係るエンゲージメント推定方法では、第1~6の態様のいずれか1つにおいて、第3ステップは、複数の時点間での労働者の気分の変化量を求めるステップを含む。回答情報に基づいた特徴量は、変化量を含む。 In addition, in the engagement estimation method according to the seventh aspect, in any one of the first to sixth aspects, the third step includes a step of determining the amount of change in the worker's mood between multiple points in time. The feature based on the response information includes the amount of change.
 上記の構成によれば、エンゲージメント推定システム(1)は、エンゲージメントを効果的に推定できる。  With the above configuration, the engagement estimation system (1) can effectively estimate engagement.
 また、第8の態様に係るエンゲージメント推定方法では、第1~7の態様のいずれか1つにおいて、回答情報に基づいた特徴量は、複数の時点の各々における労働者の気分に対応するパラメータである。 In addition, in the engagement estimation method according to the eighth aspect, in any one of the first to seventh aspects, the feature based on the response information is a parameter corresponding to the worker's mood at each of a plurality of points in time.
 上記の構成によれば、エンゲージメント推定システム(1)は、エンゲージメントを効果的に推定できる。  With the above configuration, the engagement estimation system (1) can effectively estimate engagement.
 また、第9の態様に係るエンゲージメント推定方法は、第1~8の態様のいずれか1つにおいて、第1ステップにおいて労働者の気分として当てはまる画像が選択されなかった場合に、選択を促す通知を行う通知ステップを更に有する。 In addition, the engagement estimation method according to the ninth aspect, in any one of the first to eighth aspects, further includes a notification step of issuing a notification to prompt a selection if an image that matches the worker's mood is not selected in the first step.
 上記の構成によれば、労働者が回答し忘れてエンゲージメント推定システム(1)においてエンゲージメントを推定できなくなる可能性を低減できる。 The above configuration reduces the possibility that a worker will forget to respond, which makes it impossible for the engagement estimation system (1) to estimate their engagement.
 また、第10の態様に係るエンゲージメント推定方法では、第1~9の態様のいずれか1つにおいて、第1ステップにおいて表示される複数の画像の各々は、人の表情を示す画像である。 In addition, in the engagement estimation method according to the tenth aspect, in any one of the first to ninth aspects, each of the multiple images displayed in the first step is an image showing a person's facial expression.
 上記の構成によれば、労働者が気分に当てはまる画像を直感的に選択しやすい。 The above configuration makes it easy for workers to intuitively select an image that matches their mood.
 第1の態様以外の構成については、エンゲージメント推定方法に必須の構成ではなく、適宜省略可能である。 Configurations other than the first aspect are not essential to the engagement estimation method and may be omitted as appropriate.
 また、第11の態様に係るプログラムは、第1~10の態様のいずれか1つに係るエンゲージメント推定方法を、(エンゲージメント推定システム1の)コンピュータシステムの1以上のプロセッサに実行させるためのプログラムである。 The program according to the eleventh aspect is a program for causing one or more processors of a computer system (of the engagement estimation system 1) to execute the engagement estimation method according to any one of the first to tenth aspects.
 上記の構成によれば、回答に要する労働者の負担が軽減される。 The above configuration reduces the burden on workers when responding.
 また、第12の態様に係るエンゲージメント推定システム(1)は、仕事に対する労働者のエンゲージメントを推定する。エンゲージメント推定システム(1)は、取得部(21)と、記憶部(11)と、特徴量決定部(23)と、回帰部(24)と、推定部(25)と、を備える。取得部(21)は、労働者用端末(PC8)から、回答情報を取得する。労働者端末は、質問ステップを、複数の時点において実施する。質問ステップでは、複数の気分と一対一で対応した複数の画像を表示し、複数の画像の中から労働者の気分として当てはまる画像を労働者に選択させる。回答情報は、複数の時点の各々において複数の画像のうちいずれの画像が選択されたかに関する情報である。記憶部(11)は、回答情報を記憶する。特徴量決定部(23)は、複数の特徴量を決定する。回帰部(24)は、第1の期間における複数の特徴量と、予め求められた、第1の期間における労働者のエンゲージメントと、の関係を表す回帰式を取得する。推定部(25)は、第1の期間とは別の第2の期間における複数の特徴量と、回帰式と、に基づいて、第2の期間における労働者のエンゲージメントを推定する。複数の特徴量は、生体情報計測端末(3)により計測される労働者の生体情報に基づいた特徴量と、労働者の就業場所における、労働者の位置情報に基づいた特徴量と、職場における労働者と別の労働者との関係性に関する関係性情報に基づいた特徴量と、労働者が仕事に使用するコンピュータシステム(PC8)の使用履歴に基づいた特徴量と、のうち、少なくとも1つを含む。複数の特徴量は、回答情報に基づいた特徴量を更に含む。 Furthermore, an engagement estimation system (1) relating to a twelfth aspect estimates a worker's engagement with work. The engagement estimation system (1) includes an acquisition unit (21), a memory unit (11), a feature determination unit (23), a regression unit (24), and an estimation unit (25). The acquisition unit (21) acquires answer information from a worker terminal (PC8). The worker terminal performs a questioning step at multiple time points. In the questioning step, multiple images corresponding one-to-one to multiple moods are displayed, and the worker is prompted to select an image from the multiple images that matches the worker's mood. The answer information is information regarding which image was selected from the multiple images at each of the multiple time points. The memory unit (11) stores the answer information. The feature determination unit (23) determines multiple feature amounts. The regression unit (24) acquires a regression equation expressing the relationship between the multiple feature amounts in the first period and the worker's engagement in the first period, which has been determined in advance. 
The estimation unit (25) estimates the worker's engagement in the second period based on the multiple feature amounts in the second period different from the first period and the regression equation. The multiple feature amounts include at least one of a feature amount based on the worker's biometric information measured by the biometric information measurement terminal (3), a feature amount based on the worker's position information in the worker's workplace, a feature amount based on relationship information regarding the relationship between the worker and another worker in the workplace, and a feature amount based on the usage history of a computer system (PC8) used by the worker for work. The multiple feature amounts further include a feature amount based on response information.
 上記の構成によれば、回答に要する労働者の負担が軽減される。 The above configuration reduces the burden on workers when responding.
 また、第13の態様に係るエンゲージメント推定システム(1)は、第12の態様において、特徴量決定部(23)は、複数の時点間での労働者の気分の変化量を求める。回答情報に基づいた特徴量は、変化量のばらつきを含む。 In addition, in the engagement estimation system (1) according to the thirteenth aspect, in the twelfth aspect, the feature determination unit (23) determines the amount of change in the worker's mood between multiple points in time. The feature based on the response information includes the variability in the amount of change.
 上記の構成によれば、エンゲージメント推定システム(1)は、エンゲージメントを効果的に推定できる。  With the above configuration, the engagement estimation system (1) can effectively estimate engagement.
 第12の態様以外の構成については、エンゲージメント推定システム(1)に必須の構成ではなく、適宜省略可能である。 Configurations other than the twelfth aspect are not essential to the engagement estimation system (1) and may be omitted as appropriate.
 上記態様に限らず、実施形態に係るエンゲージメント推定システム(1)の種々の構成(変形例を含む)は、エンゲージメント推定方法、(コンピュータ)プログラム、又はプログラムを記録した非一時的記録媒体にて具現化可能である。 Not limited to the above aspects, various configurations (including modified examples) of the engagement estimation system (1) according to the embodiment can be embodied in an engagement estimation method, a (computer) program, or a non-transitory recording medium having a program recorded thereon.
1 エンゲージメント推定システム
3 生体情報計測端末
8 PC(労働者用端末、コンピュータシステム)
11 記憶部
21 取得部
23 特徴量決定部
24 回帰部
25 推定部
1 Engagement estimation system
3 Biometric information measurement terminal
8 PC (worker terminal, computer system)
11 Storage unit
21 Acquisition unit
23 Feature amount determination unit
24 Regression unit
25 Estimation unit

Claims (13)

  1.  エンゲージメント推定システムにより実行され、仕事に対する労働者のエンゲージメントを推定するエンゲージメント推定方法であって、
     複数の気分と一対一で対応した複数の画像を表示し、前記複数の画像の中から前記労働者の気分として当てはまる画像を前記労働者に選択させる質問ステップを、複数の時点において実施する第1ステップと、
     前記複数の時点の各々において前記複数の画像のうちいずれの画像が選択されたかに関する回答情報を記憶する第2ステップと、
     複数の特徴量を決定する第3ステップと、
     第1の期間における前記複数の特徴量と、予め求められた、前記第1の期間における前記労働者の前記エンゲージメントと、の関係を表す回帰式を取得する第4ステップと、
     前記第1の期間とは別の第2の期間における前記複数の特徴量と、前記回帰式と、に基づいて、前記第2の期間における前記労働者の前記エンゲージメントを推定する第5ステップと、を有し、
     前記複数の特徴量は、
      生体情報計測端末により計測される前記労働者の生体情報に基づいた特徴量と、
      前記労働者の就業場所における、前記労働者の位置情報に基づいた特徴量と、
      職場における前記労働者と別の労働者との関係性に関する関係性情報に基づいた特徴量と、
      前記労働者が前記仕事に使用するコンピュータシステムの使用履歴に基づいた特徴量と、のうち、少なくとも1つを含み、
     前記複数の特徴量は、前記回答情報に基づいた特徴量を更に含む、
     エンゲージメント推定方法。
    An engagement estimation method for estimating a worker's engagement in a job, the method being executed by an engagement estimation system, the method comprising:
    A first step of displaying a plurality of images corresponding one-to-one to a plurality of moods and asking the worker to select an image that corresponds to the worker's mood from the plurality of images at a plurality of time points;
    a second step of storing answer information regarding which image of the plurality of images was selected at each of the plurality of time points;
    a third step of determining a plurality of features;
    A fourth step of acquiring a regression equation expressing a relationship between the plurality of feature amounts in a first period and the engagement of the worker in the first period, the relationship being determined in advance;
    and a fifth step of estimating the engagement of the worker during a second period based on the plurality of feature amounts during the second period and the regression equation, the fifth step comprising:
    The plurality of feature amounts are
    A feature based on the worker's biometric information measured by a biometric information measuring terminal;
    A feature based on location information of the worker at the workplace of the worker;
    A feature based on relationship information regarding a relationship between the worker and another worker in the workplace;
    and a feature based on a usage history of a computer system used by the worker for the job;
    the plurality of feature amounts further includes a feature amount based on the answer information,
    Engagement Estimation Methodology.
  2.  前記第4ステップでは、前記第1の期間における前記複数の特徴量と、予め求められた、前記第1の期間における前記労働者の前記エンゲージメントと、に基づいて、前記回帰式を求める、
     請求項1に記載のエンゲージメント推定方法。
    In the fourth step, the regression equation is obtained based on the plurality of feature amounts in the first period and the engagement of the worker in the first period that is obtained in advance.
    The method for estimating engagement according to claim 1 .
  3.  前記第5ステップでの推定結果を基に、閲覧者に提示する内容を生成する提示内容生成ステップを更に有する、
     請求項1に記載のエンゲージメント推定方法。
    The method further includes a presentation content generating step of generating content to be presented to a viewer based on the estimation result in the fifth step.
    The method for estimating engagement according to claim 1 .
  4.  前記提示内容生成ステップで生成された前記内容を端末に送信する送信ステップを更に有する、
     請求項3に記載のエンゲージメント推定方法。
    The method further includes a transmission step of transmitting the content generated in the presentation content generation step to a terminal.
    The method for estimating engagement according to claim 3 .
  5.  前記第1ステップでは、前記質問ステップを、1日のうちの複数の時間帯に実施する、
     請求項1に記載のエンゲージメント推定方法。
    In the first step, the questioning step is carried out at a plurality of time periods in a day.
    The method for estimating engagement according to claim 1 .
  6.  前記第3ステップは、前記複数の時点間での前記労働者の前記気分の変化量を求めるステップを含み、
     前記回答情報に基づいた特徴量は、前記変化量のばらつきを含む、
     請求項1に記載のエンゲージメント推定方法。
    The third step includes a step of determining a change in the mood of the worker between the plurality of time points,
    the feature based on the response information includes a variance in the amount of change;
    The method for estimating engagement according to claim 1 .
  7.  前記第3ステップは、前記複数の時点間での前記労働者の前記気分の変化量を求めるステップを含み、
     前記回答情報に基づいた特徴量は、前記変化量を含む、
     請求項1に記載のエンゲージメント推定方法。
    The third step includes a step of determining a change in the mood of the worker between the plurality of time points,
    the feature based on the answer information includes the amount of change;
    The method for estimating engagement according to claim 1 .
  8.  前記回答情報に基づいた特徴量は、前記複数の時点の各々における前記労働者の前記気分に対応するパラメータである、
     請求項1に記載のエンゲージメント推定方法。
    The feature based on the response information is a parameter corresponding to the mood of the worker at each of the plurality of time points.
    The method for estimating engagement according to claim 1 .
  9.  The method further includes a notification step of issuing a notification prompting a selection when no image matching the mood of the worker is selected in the first step.
      The method for estimating engagement according to claim 1.
  10.  Each of the plurality of images displayed in the first step is an image showing a human facial expression.
      The method for estimating engagement according to claim 1.
  11.  A program for causing one or more processors of a computer system to execute the method for estimating engagement according to claim 1.
  12.  An engagement estimation system for estimating a worker's engagement with work, the system comprising:
      an acquisition unit that acquires answer information from a worker terminal, the worker terminal carrying out, at a plurality of time points, a questioning step of displaying a plurality of images corresponding one-to-one to a plurality of moods and having the worker select an image that matches the worker's mood from among the plurality of images, the answer information indicating which of the plurality of images was selected at each of the plurality of time points;
      a storage unit that stores the answer information;
      a feature determining unit that determines a plurality of features;
      a regression unit that acquires a regression equation expressing a relationship between the plurality of features in a first period and the engagement of the worker in the first period, the engagement having been obtained in advance; and
      an estimation unit that estimates the engagement of the worker in a second period, different from the first period, based on the plurality of features in the second period and the regression equation,
      wherein the plurality of features includes at least one of:
      a feature based on biometric information of the worker measured by a biometric information measuring terminal;
      a feature based on location information of the worker at the worker's workplace;
      a feature based on relationship information regarding a relationship between the worker and another worker in the workplace; and
      a feature based on a usage history of a computer system used by the worker for the work, and
      the plurality of features further includes a feature based on the answer information.
  13.  The feature determining unit determines an amount of change in the mood of the worker between the plurality of time points, and
      the feature based on the answer information includes a variance of the amount of change.
      The engagement estimation system according to claim 12.
PCT/JP2024/014490 2023-04-19 2024-04-10 Engagement estimation method, program, and engagement estimation system WO2024219300A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023068721 2023-04-19
JP2023-068721 2023-04-19

Publications (1)

Publication Number Publication Date
WO2024219300A1 true WO2024219300A1 (en) 2024-10-24

Family

ID=93152473


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018102617A (en) * 2016-12-27 2018-07-05 オムロン株式会社 Emotion estimation apparatus, method, and program
JP2020154938A (en) * 2019-03-22 2020-09-24 Kddi株式会社 Device, program, and method for estimating satisfaction level based on action information
JP2021026550A (en) * 2019-08-06 2021-02-22 Fringe81株式会社 Server and system for managing post
WO2022224661A1 (en) * 2021-04-22 2022-10-27 株式会社Nttドコモ Ability assessment device

