An Emotion Aware Task Automation Architecture Based on Semantic Technologies for Smart Offices
Figure 1. Main classes of the ontologies involved in the semantic modelling.
Figure 2. Emotion Aware Automation Platform Architecture.
Figure 3. Deployment for the experiment.
Figure 4. Results for Q2 and Q3.
Figure 5. Results for Q4 and Q5.
Abstract
1. Introduction
2. Background
2.1. Emotion Aware AmI (AmE)
2.2. Smart Offices
2.3. Emotion Recognition
2.4. Emotion Regulation
2.5. Semantic Modelling
3. Semantic Modelling for the Smart Office Environment
4. Emotion Aware Task Automation Platform Architecture
4.1. Emotional Context Recognizer
4.2. Emotion Aware Task Automation Server
- (a) If the stress level of a worker is too high, then reduce his/her number of tasks. When a very high stress level is detected in a worker, this rule proposes reducing his/her workload so that his/her stress level falls and his/her productivity rises.
- (b) If the temperature rises above 30 °C, then turn on the air conditioning. Working at high temperatures may cause stress in workers, so this rule automatically regulates the temperature in order to prevent high stress levels.
- (c) If the average stress level of workers is too high, then play relaxing music. If most workers show high stress values, company productivity will fall significantly. Thus, this rule plays relaxing music in order to reduce the workers’ stress level.
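The event-condition-action logic of rules (a)–(c) can be sketched in plain Python. This is a minimal illustration only: the thresholds (stress above 0.8 on a 0–1 scale) and the action names are our assumptions, not values prescribed by the platform.

```python
# Illustrative sketch of rules (a)-(c); thresholds and action names are assumptions.
STRESS_THRESHOLD = 0.8    # hypothetical per-worker "too high" stress, scale 0-1
TEMP_THRESHOLD_C = 30.0   # temperature limit from rule (b)

def evaluate_rules(workers, temperature_c):
    """Return the actions fired by the current office context.

    workers: dict mapping worker name -> stress level in [0, 1].
    """
    actions = []
    # Rule (a): reduce the workload of each overly stressed worker.
    for name, stress in workers.items():
        if stress > STRESS_THRESHOLD:
            actions.append(("reduce_task_number", name))
    # Rule (b): control the temperature to prevent heat-induced stress.
    if temperature_c > TEMP_THRESHOLD_C:
        actions.append(("turn_on_air_conditioning", None))
    # Rule (c): if the average stress is too high, relax the whole office.
    if sum(workers.values()) / len(workers) > STRESS_THRESHOLD:
        actions.append(("play_relaxing_music", None))
    return actions
```

For example, `evaluate_rules({"ana": 0.9, "luis": 0.4}, 31.0)` fires rules (a) and (b) but not (c), since the average stress (0.65) stays below the threshold.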
5. Experimentation
- H1: The use of the proposed platform regulates the emotional state of a user that is under stressful conditions.
- H2: The actions taken by the proposed platform do not disturb the workflow of the user.
- H3: The use of the proposed system improves user performance.
- H4: The use of the system increases user satisfaction.
5.1. Participants
5.2. Materials
- Emotion Research software (https://emotionresearchlab.com/). This module provides facial mood detection and emotional metrics that are fed to the automation system. It performs emotion classification in two main steps: (i) Histogram of Oriented Gradients (HOG) features are used to train an SVM classifier that localizes the face in the image; and (ii) the face image is normalized and passed to a Multilayer Perceptron that performs the emotion classification. Emotion Research reports 98% accuracy in emotion recognition tasks.
- A camera (Gucee HD92) feeds the video to the emotion recognizer submodule.
- Room lighting (WS2812B LED strip controlled by WeMos ESP8266 board) is used as an actuator on the light level of the room, with the possibility of using several lighting patterns.
- Google Chromecast [85] streams content over the local computer network.
- LG TV 49UJ651V is used for displaying images.
- Google Home is used for communicating with the user. In this experiment, the system can formulate recommendations to the user.
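The two-step classification described for the Emotion Research module (HOG features with an SVM for face localization, then a normalized crop fed to an MLP for emotion classification) can be sketched with scikit-learn. Everything below is an illustrative stand-in trained on synthetic data: the simplified HOG-style descriptor, window size, and labels are our assumptions, not the actual Emotion Research implementation.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

def hog_like_features(img, bins=9):
    """Very small HOG-style descriptor: histogram of gradient orientations."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx) % np.pi  # unsigned orientation in [0, pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-9)

def normalize(face):
    """Zero-mean, unit-variance normalization of a face crop."""
    return ((face - face.mean()) / (face.std() + 1e-9)).ravel()

rng = np.random.default_rng(0)

# Step (i): SVM on HOG-style features decides whether a window contains a face.
windows = rng.random((40, 16, 16))
is_face = rng.integers(0, 2, 40)  # synthetic labels, for the sketch only
face_detector = SVC().fit([hog_like_features(w) for w in windows], is_face)

# Step (ii): MLP classifies the emotion from the normalized face crop.
faces = rng.random((40, 16, 16))
emotion = rng.integers(0, 3, 40)  # e.g., 0=neutral, 1=happy, 2=sad
emotion_clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300,
                            random_state=0).fit([normalize(f) for f in faces],
                                                emotion)

def classify(img):
    """Full pipeline: localize a face, then classify its emotion."""
    if face_detector.predict([hog_like_features(img)])[0] == 1:
        return int(emotion_clf.predict([normalize(img)])[0])
    return None  # no face found in this window
```

The key design point mirrored here is the cascade: the cheap detector filters windows so the heavier emotion classifier only runs on face crops.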
5.3. Procedure
5.4. Design
5.5. Results and Discussion
6. Conclusions and Outlook
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Miorandi, D.; Sicari, S.; Pellegrini, F.D.; Chlamtac, I. Internet of things: Vision, applications and research challenges. Ad Hoc Netw. 2012, 10, 1497–1516. [Google Scholar] [CrossRef]
- Augusto, J.C. Ambient Intelligence: The Confluence of Ubiquitous/Pervasive Computing and Artificial Intelligence; Springer: London, UK; pp. 213–234.
- Gutnik, L.A.; Hakimzada, A.F.; Yoskowitz, N.A.; Patel, V.L. The role of emotion in decision-making: A cognitive neuroeconomic approach towards understanding sexual risk behavior. J. Biomed. Inform. 2006, 39, 720–736. [Google Scholar] [CrossRef] [PubMed]
- Kok, B.E.; Coffey, K.A.; Cohn, M.A.; Catalino, L.I.; Vacharkulksemsuk, T.; Algoe, S.B.; Brantley, M.; Fredrickson, B.L. How Positive Emotions Build Physical Health: Perceived Positive Social Connections Account for the Upward Spiral Between Positive Emotions and Vagal Tone. Psychol. Sci. 2013, 24, 1123–1132. [Google Scholar] [CrossRef] [PubMed]
- Nguyen, V.T.; Longin, D.; Ho, T.V.; Gaudou, B. Integration of Emotion in Evacuation Simulation. In Information Systems for Crisis Response and Management in Mediterranean Countries, Proceedings of the First International Conference, ISCRAM-med 2014, Toulouse, France, 15–17 October 2014; Springer International Publishing: Cham, Switzerland, 2014; pp. 192–205. [Google Scholar]
- Pervez, M.A. Impact of emotions on employee’s job performance: An evidence from organizations of Pakistan. OIDA Int. J. Sustain. Dev. 2010, 1, 11–16. [Google Scholar]
- Weiss, H.M. Introductory Comments: Antecedents of Emotional Experiences at Work. Motiv. Emot. 2002, 26, 1–2. [Google Scholar] [CrossRef]
- Bhuyar, R.; Ansari, S. Design and Implementation of Smart Office Automation System. Int. J. Comput. Appl. 2016, 151, 37–42. [Google Scholar] [CrossRef]
- Van der Valk, S.; Myers, T.; Atkinson, I.; Mohring, K. Sensor networks in workplaces: Correlating comfort and productivity. In Proceedings of the 2015 IEEE Tenth International Conference on Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP), Singapore, 7–9 April 2015; pp. 1–6. [Google Scholar]
- Zhou, J.; Yu, C.; Riekki, J.; Kärkkäinen, E. AmE framework: A model for emotion-aware ambient intelligence. In Proceedings of the second international conference on affective computing and intelligent interaction (ACII2007): Doctoral Consortium, Lisbon, Portugal, 12–14 September 2007; p. 45. [Google Scholar]
- Acampora, G.; Loia, V.; Vitiello, A. Distributing emotional services in ambient intelligence through cognitive agents. Serv. Oriented Comput. Appl. 2011, 5, 17–35. [Google Scholar] [CrossRef]
- Beer, W.; Christian, V.; Ferscha, A.; Mehrmann, L. Modeling Context-Aware Behavior by Interpreted ECA Rules. In Euro-Par 2003 Parallel Processing; Kosch, H., Böszörményi, L., Hellwagner, H., Eds.; Springer: Berlin/Heidelberg, Germany, 2003; pp. 1064–1073. [Google Scholar]
- Coronado, M.; Iglesias, C.A.; Serrano, E. Modelling rules for automating the Evented WEb by semantic technologies. Expert Syst. Appl. 2015, 42, 7979–7990. [Google Scholar] [CrossRef]
- Muñoz, S.; Fernández, A.; Coronado, M.; Iglesias, C.A. Smart Office Automation based on Semantic Event-Driven Rules. In Proceedings of the Workshop on Smart Offices and Other Workplaces, Colocated with 12th International Conference on Intelligent Environments (IE’16), London, UK, 14–16 September 2016; Ambient Intelligence and Smart Environments. IOS Press: Clifton, VA, USA, 2016; Volume 21, pp. 33–42. [Google Scholar]
- Inada, T.; Igaki, H.; Ikegami, K.; Matsumoto, S.; Nakamura, M.; Kusumoto, S. Detecting Service Chains and Feature Interactions in Sensor-Driven Home Network Services. Sensors 2012, 12, 8447–8464. [Google Scholar] [CrossRef] [PubMed]
- Zhou, J.; Kallio, P. Ambient emotion intelligence: from business awareness to emotion awareness. In Proceedings of the 17th International Conference on Systems Research, Berlin, Germany, 15–17 April 2014; pp. 47–54. [Google Scholar]
- Kanjo, E.; Al-Husain, L.; Chamberlain, A. Emotions in context: Examining pervasive affective sensing systems, applications, and analyses. Pers. Ubiquitous Comput. 2015, 19, 1197–1212. [Google Scholar] [CrossRef]
- Kanjo, E.; El Mawass, N.; Craveiro, J. Social, disconnected or in between: Mobile data reveals urban mood. In Proceedings of the 3rd International Conference on the Analysis of Mobile Phone Datasets (NetMob’13), Cambridge, MA, USA, 1–3 May 2013. [Google Scholar]
- Wagner, J.; André, E.; Jung, F. Smart sensor integration: A framework for multimodal emotion recognition in real-time. In Proceedings of the 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, Amsterdam, The Netherlands, 10–12 September 2009; pp. 1–8. [Google Scholar]
- Gay, G.; Pollak, J.; Adams, P.; Leonard, J.P. Pilot study of Aurora, a social, mobile-phone-based emotion sharing and recording system. J. Diabetes Sci. Technol. 2011, 5, 325–332. [Google Scholar] [CrossRef] [PubMed]
- Gaggioli, A.; Pioggia, G.; Tartarisco, G.; Baldus, G.; Corda, D.; Cipresso, P.; Riva, G. A mobile data collection platform for mental health research. Pers. Ubiquitous Comput. 2013, 17, 241–251. [Google Scholar] [CrossRef]
- Morris, M.E.; Kathawala, Q.; Leen, T.K.; Gorenstein, E.E.; Guilak, F.; Labhard, M.; Deleeuw, W. Mobile therapy: Case study evaluations of a cell phone application for emotional self-awareness. J. Med. Internet Res. 2010, 12, e10. [Google Scholar] [CrossRef] [PubMed]
- Bergner, B.S.; Exner, J.P.; Zeile, P.; Rumberg, M. Sensing the city—How to identify recreational benefits of urban green areas with the help of sensor technology. In Proceedings of the REAL CORP 2012, Schwechat, Austria, 14–16 May 2012. [Google Scholar]
- Fernández-Caballero, A.; Martínez-Rodrigo, A.; Pastor, J.M.; Castillo, J.C.; Lozano-Monasor, E.; López, M.T.; Zangróniz, R.; Latorre, J.M.; Fernández-Sotos, A. Smart environment architecture for emotion detection and regulation. J. Biomed. Inform. 2016, 64, 55–73. [Google Scholar] [CrossRef] [PubMed]
- Jungum, N.V.; Laurent, E. Emotions in pervasive computing environments. Int. J. Comput. Sci. Issues 2009, 6, 8–22. [Google Scholar]
- Bisio, I.; Delfino, A.; Lavagetto, F.; Marchese, M.; Sciarrone, A. Gender-driven emotion recognition through speech signals for ambient intelligence applications. IEEE Trans. Emerg. Top. Comput. 2013, 1, 244–257. [Google Scholar] [CrossRef]
- Acampora, G.; Vitiello, A. Interoperable neuro-fuzzy services for emotion-aware ambient intelligence. Neurocomputing 2013, 122, 3–12. [Google Scholar] [CrossRef]
- Marreiros, G.; Santos, R.; Novais, P.; Machado, J.; Ramos, C.; Neves, J.; Bula-Cruz, J. Argumentation-based decision making in ambient intelligence environments. In Proceedings of the Portuguese Conference on Artificial Intelligence, Guimarães, Portugal, 3–7 December 2007; pp. 309–322. [Google Scholar]
- Hagras, H.; Callaghan, V.; Colley, M.; Clarke, G.; Pounds-Cornish, A.; Duman, H. Creating an ambient-intelligence environment using embedded agents. IEEE Intell. Syst. 2004, 19, 12–20. [Google Scholar] [CrossRef]
- Mennicken, S.; Vermeulen, J.; Huang, E.M. From today’s augmented houses to tomorrow’s smart homes: New directions for home automation research. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Seattle, WA, USA, 13–17 September 2014; pp. 105–115. [Google Scholar]
- Cook, D.; Das, S. Smart Environments: Technology, Protocols and Applications (Wiley Series on Parallel and Distributed Computing); Wiley-Interscience: Seattle, WA, USA, 2004. [Google Scholar]
- Marsa-maestre, I.; Lopez-carmona, M.A.; Velasco, J.R.; Navarro, A. Mobile Agents for Service Personalization in Smart Environments. J. Netw. 2008, 3. [Google Scholar] [CrossRef]
- Furdik, K.; Lukac, G.; Sabol, T.; Kostelnik, P. The Network Architecture Designed for an Adaptable IoT-based Smart Office Solution. Int. J. Comput. Netw. Commun. Secur. 2013, 1, 216–224. [Google Scholar]
- Shigeta, H.; Nakase, J.; Tsunematsu, Y.; Kiyokawa, K.; Hatanaka, M.; Hosoda, K.; Okada, M.; Ishihara, Y.; Ooshita, F.; Kakugawa, H.; et al. Implementation of a smart office system in an ambient environment. In Proceedings of the 2012 IEEE Virtual Reality Workshops (VRW), Costa Mesa, CA, USA, 4–8 March 2012; pp. 1–2. [Google Scholar]
- Zenonos, A.; Khan, A.; Kalogridis, G.; Vatsikas, S.; Lewis, T.; Sooriyabandara, M. HealthyOffice: Mood recognition at work using smartphones and wearable sensors. In Proceedings of the 2016 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), Sydney, Australia, 14–18 March 2016; pp. 1–6. [Google Scholar]
- Li, H. A novel design for a comprehensive smart automation system for the office environment. In Proceedings of the 2014 IEEE Emerging Technology and Factory Automation (ETFA), Barcelona, Spain, 16–19 September 2014; pp. 1–4. [Google Scholar]
- Jalal, A.; Kamal, S.; Kim, D. A Depth Video Sensor-Based Life-Logging Human Activity Recognition System for Elderly Care in Smart Indoor Environments. Sensors 2014, 14, 11735–11759. [Google Scholar] [CrossRef] [PubMed]
- Kumar, V.; Fensel, A.; Fröhlich, P. Context Based Adaptation of Semantic Rules in Smart Buildings. In Proceedings of the International Conference on Information Integration and Web-based Applications & Services, Vienna, Austria, 2–4 December 2013; ACM: New York, NY, USA, 2013; pp. 719–728. [Google Scholar]
- Alirezaie, M.; Renoux, J.; Köckemann, U.; Kristoffersson, A.; Karlsson, L.; Blomqvist, E.; Tsiftes, N.; Voigt, T.; Loutfi, A. An Ontology-based Context-aware System for Smart Homes: E-care@home. Sensors 2017, 17, 1586. [Google Scholar] [CrossRef] [PubMed]
- Coronato, A.; Pietro, G.D.; Esposito, M. A Semantic Context Service for Smart Offices. In Proceedings of the 2006 International Conference on Hybrid Information Technology, Cheju Island, Korea, 9–11 November 2006; Volume 2, pp. 391–399. [Google Scholar]
- Picard, R.W.; Healey, J. Affective wearables. Pers. Technol. 1997, 1, 231–240. [Google Scholar] [CrossRef]
- Gyrard, A. A machine-to-machine architecture to merge semantic sensor measurements. In Proceedings of the 22nd International Conference on World Wide Web, Rio de Janeiro, Brazil, 13–17 May 2013. [Google Scholar]
- Araque, O.; Corcuera-Platas, I.; Sánchez-Rada, J.F.; Iglesias, C.A. Enhancing deep learning sentiment analysis with ensemble techniques in social applications. Expert Syst. Appl. 2017, 77, 236–246. [Google Scholar] [CrossRef]
- Pantic, M.; Bartlett, M. Machine Analysis of Facial Expressions. In Face Recognition; I-Tech Education and Publishing: Vienna, Austria, 2007; pp. 377–416. [Google Scholar]
- Sebe, N.; Cohen, I.; Gevers, T.; Huang, T.S. Multimodal approaches for emotion recognition: A survey. In Proceedings of the Electronic Imaging 2005, San Jose, CA, USA, 17 January 2005; Volume 5670, p. 5670. [Google Scholar]
- Mehta, D.; Siddiqui, M.F.H.; Javaid, A.Y. Facial Emotion Recognition: A Survey and Real-World User Experiences in Mixed Reality. Sensors 2018, 18, 416. [Google Scholar] [CrossRef] [PubMed]
- Anagnostopoulos, C.N.; Iliou, T.; Giannoukos, I. Features and Classifiers for Emotion Recognition from Speech: A Survey from 2000 to 2011. Artif. Intell. Rev. 2015, 43, 155–177. [Google Scholar] [CrossRef]
- Ayadi, M.E.; Kamel, M.S.; Karray, F. Survey on speech emotion recognition: Features, classification schemes, and databases. Pattern Recognit. 2011, 44, 572–587. [Google Scholar] [CrossRef]
- Vinola, C.; Vimaladevi, K. A Survey on Human Emotion Recognition Approaches, Databases and Applications. ELCVIA Electron. Lett. Comput. Vis. Image Anal. 2015, 14, 24–44. [Google Scholar] [CrossRef]
- Brouwer, A.M.; van Wouwe, N.; Mühl, C.; van Erp, J.; Toet, A. Perceiving blocks of emotional pictures and sounds: Effects on physiological variables. Front. Hum. Neurosci. 2013, 7, 1–10. [Google Scholar] [CrossRef] [PubMed]
- Campos, J.J.; Frankel, C.B.; Camras, L. On the Nature of Emotion Regulation. Child Dev. 2004, 75, 377–394. [Google Scholar] [CrossRef] [PubMed]
- Sokolova, M.V.; Fernández-Caballero, A.; Ros, L.; Latorre, J.M.; Serrano, J.P. Evaluation of Color Preference for Emotion Regulation. In Proceedings of the Artificial Computation in Biology and Medicine: International Work-Conference on the Interplay Between Natural and Artificial Computation, IWINAC 2015, Elche, Spain, 1–5 June 2015; Springer International Publishing: Cham, Switzerland, 2015; pp. 479–487. [Google Scholar]
- Philippot, P.; Chapelle, G.; Blairy, S. Respiratory feedback in the generation of emotion. Cogn. Emot. 2002, 16, 605–627. [Google Scholar] [CrossRef]
- Xin, J.H.; Cheng, K.M.; Taylor, G.; Sato, T.; Hansuebsai, A. Cross-regional comparison of colour emotions Part I: Quantitative analysis. Color Res. Appl. 2004, 29, 451–457. [Google Scholar] [CrossRef]
- Xin, J.H.; Cheng, K.M.; Taylor, G.; Sato, T.; Hansuebsai, A. Cross-regional comparison of colour emotions Part II: Qualitative analysis. Color Res. Appl. 2004, 29, 458–466. [Google Scholar] [CrossRef]
- Ortiz-García-Cervigón, V.; Sokolova, M.V.; García-Muñoz, R.M.; Fernández-Caballero, A. LED Strips for Color- and Illumination-Based Emotion Regulation at Home. In Proceedings of the 7th International Work-Conference, IWAAL 2015, ICT-Based Solutions in Real Life Situations, Puerto Varas, Chile, 1–4 December 2015; pp. 277–287. [Google Scholar]
- Lingham, J.; Theorell, T. Self-selected “favourite” stimulative and sedative music listening—How does familiar and preferred music listening affect the body? Nord. J. Music Ther. 2009, 18, 150–166. [Google Scholar] [CrossRef]
- Pannese, A. A gray matter of taste: Sound perception, music cognition, and Baumgarten’s aesthetics. Stud. Hist. Philos. Sci. Part C Stud. Hist. Philos. Biol. Biomed. Sci. 2012, 43, 594–601. [Google Scholar] [CrossRef] [PubMed]
- Van der Zwaag, M.D.; Dijksterhuis, C.; de Waard, D.; Mulder, B.L.; Westerink, J.H.; Brookhuis, K.A. The influence of music on mood and performance while driving. Ergonomics 2012, 55, 12–22. [Google Scholar] [CrossRef] [PubMed]
- Uhlig, S.; Jaschke, A.; Scherder, E. Effects of Music on Emotion Regulation: A Systematic Literature Review. In Proceedings of the 3rd International Conference on Music and Emotion (ICME3), Jyväskylä, Finland, 11–15 June 2013; pp. 11–15. [Google Scholar]
- Freggens, M.J. The Effect of Music Type on Emotion Regulation: An Emotional Stroop Experiment. Ph.D. Thesis, Georgia State University, Atlanta, GA, USA, 2015. [Google Scholar]
- Gangemi, A. Ontology design patterns for semantic web content. In Proceedings of the 4th International Semantic Web Conference, Galway, Ireland, 6–10 November 2005; Springer: Berlin/Heidelberg, Germany, 2005; pp. 262–276. [Google Scholar]
- Prud’hommeaux, E.; Seaborne, A. SPARQL Query Language for RDF; Technical Report; W3C: Cambridge, MA, USA, 2006. [Google Scholar]
- Klyne, G.; Carroll, J.J. Resource Description Framework (RDF): Concepts and Abstract Syntax; Technical Report; W3C: Cambridge, MA, USA, 2006. [Google Scholar]
- Sporny, M.; Longley, D.; Kellogg, G.; Lanthaler, M.; Lindström, N. JSON-LD 1.0; Technical Report; W3C: Cambridge, MA, USA, 2014. [Google Scholar]
- Boley, H.; Paschke, A.; Shafiq, O. RuleML 1.0: The Overarching Specification of Web Rules. In Proceedings of the Semantic Web Rules: International Symposium, RuleML 2010, Washington, DC, USA, 21–23 October 2010; Springer: Berlin/Heidelberg, Germany, 2010; pp. 162–178. [Google Scholar]
- O’Connor, M.; Knublauch, H.; Tu, S.; Grosof, B.; Dean, M.; Grosso, W.; Musen, M. Supporting Rule System Interoperability on the Semantic Web with SWRL. In Proceedings of the Semantic Web–ISWC 2005: 4th International Semantic Web Conference, ISWC 2005, Galway, Ireland, 6–10 November 2005; Springer: Berlin/Heidelberg, Germany, 2005; pp. 974–986. [Google Scholar]
- Kifer, M. Rule Interchange Format: The Framework. In Proceedings of the Web Reasoning and Rule Systems: Second International Conference, RR 2008, Karlsruhe, Germany, 31 October–1 November 2008; Springer: Berlin/Heidelberg, Germany, 2008; pp. 1–11. [Google Scholar]
- Knublauch, H.; Hendler, J.A.; Idehen, K. SPIN Overview and Motivation; Technical Report; W3C: Cambridge, MA, USA, 2011. [Google Scholar]
- Berners-Lee, T. Notation3 Logic; Technical Report; W3C: Cambridge, MA, USA, 2011. [Google Scholar]
- Coronado, M.; Iglesias, C.A.; Serrano, E. Task Automation Services Study. 2015. Available online: http://www.gsi.dit.upm.es/ontologies/ewe/study/full-results.html (accessed on 18 April 2018).
- Schröder, M.; Baggia, P.; Burkhardt, F.; Pelachaud, C.; Peter, C.; Zovato, E. EmotionML—An upcoming standard for representing emotions and related states. In Proceedings of the International Conference on Affective Computing and Intelligent Interaction, Memphis, TN, USA, 9–12 October 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 316–325. [Google Scholar]
- Sánchez-Rada, J.F.; Iglesias, C.A. Onyx: A Linked Data Approach to Emotion Representation. Inf. Process. Manag. 2016, 52, 99–114. [Google Scholar] [CrossRef]
- Grassi, M. Developing HEO Human Emotions Ontology. In Biometric ID Management and Multimodal Communication, Proceedings of the Joint COST 2101 and 2102 International Conference, BioID_MultiComm 2009, Madrid, Spain, 16–18 September 2009; Springer: Berlin/Heidelberg, Germany, 2009; pp. 244–251. [Google Scholar]
- Lebo, T.; Sahoo, S.; McGuinness, D.; Belhajjame, K.; Cheney, J.; Corsar, D.; Garijo, D.; Soiland-Reyes, S.; Zednik, S.; Zhao, J. PROV-O: The PROV Ontology; W3C Recommendation; W3C: Cambridge, MA, USA, 2013. [Google Scholar]
- Schröder, M.; Pelachaud, C.; Ashimura, K.; Baggia, P.; Burkhardt, F.; Oltramari, A.; Peter, C.; Zovato, E. Vocabularies for emotionml. In W3C Working Group Note, World Wide Web Consortium; W3C: Cambridge, MA, USA, 2011. [Google Scholar]
- Sánchez-Rada, J.F.; Iglesias, C.A.; Gil, R. A linked data model for multimodal sentiment and emotion analysis. In Proceedings of the 4th Workshop on Linked Data in Linguistics: Resources and Applications, Beijing, China, 31 July 2015; pp. 11–19. [Google Scholar]
- Sánchez-Rada, J.F.; Iglesias, C.A.; Sagha, H.; Schuller, B.; Wood, I.; Buitelaar, P. Multimodal multimodel emotion analysis as linked data. In Proceedings of the 2017 Seventh International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), San Antonio, TX, USA, 23–26 October 2017; pp. 111–116. [Google Scholar]
- Sánchez-Rada, J.F.; Schuller, B.; Patti, V.; Buitelaar, P.; Vulcu, G.; Bulkhardt, F.; Clavel, C.; Petychakis, M.; Iglesias, C.A. Towards a Common Linked Data Model for Sentiment and Emotion Analysis. In Proceedings of the LREC 2016 Workshop Emotion and Sentiment Analysis (ESA 2016), Portorož, Slovenia, 23 May 2016; Sánchez-Rada, J.F., Schuller, B., Eds.; 2016; pp. 48–54. [Google Scholar]
- Khoozani, E.N.; Hadzic, M. Designing the human stress ontology: A formal framework to capture and represent knowledge about human stress. Aust. Psychol. 2010, 45, 258–273. [Google Scholar] [CrossRef]
- Coronado, M.; Iglesias, C.A. Task Automation Services: Automation for the masses. IEEE Internet Comput. 2015, 20, 52–58. [Google Scholar] [CrossRef]
- Verborgh, R.; Roo, J.D. Drawing Conclusions from Linked Data on the Web: The EYE Reasoner. IEEE Softw. 2015, 32, 23–27. [Google Scholar] [CrossRef]
- Sánchez-Rada, J.F.; Iglesias, C.A.; Coronado, M. A modular architecture for intelligent agents in the evented web. Web Intell. 2017, 15, 19–33. [Google Scholar] [CrossRef]
- Coronado Barrios, M. A Personal Agent Architecture for Task Automation in the Web of Data. Bringing Intelligence to Everyday Tasks. Ph.D. Thesis, Technical University of Madrid, Madrid, Spain, 2016. [Google Scholar]
- Williams, K. The Technology Ecosystem: Fueling Google’s Chromecast [WIE from Around the World]. IEEE Women Eng. Mag. 2014, 8, 30–32. [Google Scholar] [CrossRef]
- Cortina, J.M. What is coefficient alpha? An examination of theory and applications. J. Appl. Psychol. 1993, 78, 98. [Google Scholar] [CrossRef]
- Gaeta, M.; Loia, V.; Orciuoli, F.; Ritrovato, P. S-WOLF: Semantic workplace learning framework. IEEE Trans. Syst. Man Cybern. Syst. 2015, 45, 56–72. [Google Scholar] [CrossRef]
emo:SmartSpeaker a owl:Class ;
    rdfs:label "Smart Speaker" ;
    rdfs:comment "This channel represents a smart speaker." ;
    rdfs:subClassOf emo:EmotionRegulator .

emo:PlayRelaxingMusic a owl:Class ;
    rdfs:label "Play relaxing music" ;
    rdfs:comment "This action will play relaxing music." ;
    rdfs:subClassOf ewe:Action ;
    rdfs:subClassOf hso:Therapy ;
    rdfs:domain emo:SmartSpeaker .

emo:SmartLight a owl:Class ;
    rdfs:label "Smart Light" ;
    rdfs:comment "This channel represents a smart light." ;
    rdfs:subClassOf emo:EmotionRegulator .

emo:ChangeAmbientColor a owl:Class ;
    rdfs:label "Change ambient color" ;
    rdfs:comment "This action will change ambient color." ;
    rdfs:subClassOf ewe:Action ;
    rdfs:subClassOf hso:Therapy ;
    rdfs:domain emo:SmartLight .
:sad-emotion-detected a emo:EmotionDetected ;
    ewe:hasEmotion onyx:sadness .

:play-music a emo:PlayRelaxingMusic ;
    ewe:hasSong "the title of the song to be played" .

:change-ambient-color-green a emo:ChangeAmbientColor ;
    ewe:hasColor dbpedia:Green .

:regulate-stress a ewe:Rule ;
    dcterms:title "Stress regulation rule"^^xsd:string ;
    ewe:triggeredByEvent :sad-emotion-detected ;
    ewe:firesAction :change-ambient-color-green .
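The stress regulation rule above is, at its core, a mapping from a detected event to a fired action. A minimal pure-Python sketch of how a task automation server could dispatch such rules is shown below; the names mirror the Turtle listing, but the dispatch loop itself is our illustrative assumption, not the platform’s actual engine.

```python
# Minimal event-condition-action dispatch mirroring the Turtle rule above.
# The rule structure (triggeredByEvent -> firesAction) follows EWE; the
# dispatch function is an illustrative assumption.
rules = [{
    "title": "Stress regulation rule",
    "triggered_by_event": ("EmotionDetected", "sadness"),
    "fires_action": ("ChangeAmbientColor", "Green"),
}]

def handle_event(event_type, payload):
    """Return the actions fired by an incoming event."""
    return [r["fires_action"] for r in rules
            if r["triggered_by_event"] == (event_type, payload)]
```

For example, `handle_event("EmotionDetected", "sadness")` returns `[("ChangeAmbientColor", "Green")]`, while any other event fires nothing.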
No. | Hypothesis | Question Formulation
---|---|---
Q1 | H1, H2 | In which section have you been more relaxed?
Q2 | H1, H2 | What is your comfort level towards the environment?
Q3 | H3 | Do you think the environment’s state has been of help during the completion of the task?
Q4 | H4 | Would you consider it beneficial to work in this environment?
Q5 | H4 | What is your overall satisfaction in relation to the environment?
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Muñoz, S.; Araque, O.; Sánchez-Rada, J.F.; Iglesias, C.A. An Emotion Aware Task Automation Architecture Based on Semantic Technologies for Smart Offices. Sensors 2018, 18, 1499. https://doi.org/10.3390/s18051499