
Review

Seven Years after the Manifesto: Literature Review and Research Directions for Technologies in Animal Computer Interaction

1 Department of Computer Science, Aalto University, 02150 Espoo, Finland
2 Departamento de Sistemas Informáticos y Computación, Universitat Politècnica de València, 46022 Valencia, Spain
3 Department of Computer Science, University of Central Lancashire, Preston PR1 2HE, UK
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2018, 2(2), 30; https://doi.org/10.3390/mti2020030
Submission received: 18 April 2018 / Revised: 25 May 2018 / Accepted: 28 May 2018 / Published: 1 June 2018
(This article belongs to the Special Issue Multimodal Technologies in Animal–Computer Interaction)
Figure 1. Representation of the gulf of execution in ACI systems.
Figure 2. Framework for technologies in ACI (building from Jukan et al. [18]).
Figure 3. Button system used as a pressure plate to dispense treats [30].
Figure 4. A dog activating a switch [21]. Photo courtesy of The Open University.
Figure 5. Posture system used by Majikes et al. [36].
Figure 6. Olfaction cancer detection system [41]. Photo courtesy of The Open University.
Figure 7. Dog training to click on points on a touchscreen interface [8].
Figure 8. Apps for Apes: An orangutan using a touchscreen [101].
Figure 9. Human and orangutan playing together with a touchscreen interface [97].
Figure 10. Using a head-mounted, eye-tracking system with dogs [110].
Figure 11. Tracking cats using depth measurement via an Xbox Kinect to detect posture [51].
Figure 12. Tracking dogs using posture recognition via an Xbox Kinect [114].

Abstract

As technologies diversify and become embedded in everyday lives, both the technologies that animals are exposed to and the new technologies being developed for animals within the field of Animal Computer Interaction (ACI) are increasing. As we approach seven years since the ACI manifesto, which grounded the field within Human Computer Interaction and Computer Science, this thematic literature review looks at the technologies developed for (non-human) animals. Technologies analysed include tangible and physical interfaces, haptic and wearable devices, olfactory interfaces, screen technology and tracking systems. The discussion explores what exactly ACI is whilst questioning what it means to be an animal, considering the impact of, and the loop between, machine and animal interactivity. The findings of this review are expected to form a first grounding foundation of ACI technologies, informing future research in animal computing and suggesting areas for future exploration.

1. Introduction

The well-being, behaviours, and physical characteristics of animals have long been studied within the animal biology sciences, but the landscape changes as the understanding of animals evolves. In the late twentieth century, studies were conducted into the ways that some animals behave in human-animal situations; subsequently, these studies have moved towards the ability of animals to assist humans and thus improve the human condition [1]. As technology has become embedded in the human condition, it has also become of interest in terms of how it affects the human-animal relationship, with research aiming to move away from anthropocentric work towards an animal-centric focus. Technology today has been shown to be useful for playful interactions between humans and animals [2,3], for monitoring animals [4,5], for training animals [6] and for supporting animals that care for humans [7,8]. This has driven researchers, for societal and economic reasons, to explore animals within technological situations.
One of the main initial aims for the study of Animal Computer Interaction (ACI) has been “to understand the interaction between animals and computing technology within the contexts in which animals habitually live, are active, and socialize with members of the same or other species, including humans” [9]. As a relatively new field, coined in 2011 in the ACI Manifesto [9], it has taken its main reference from Human Computer Interaction (HCI) [3,4,10], which in turn has led to an early focus on studies of the usability of technology and the user experience of animals to influence the design of interactive solutions [11,12]. Frameworks have been constructed for ACI technology in the areas of interaction design [12], Human Computer Interaction (HCI) [3], ubiquitous computing [5] and game design [13]. Some of these frameworks aim to reveal the role of technology within a human-animal interaction [3,5], whilst others aim to minimise the human role to more fully design for the animals’ unique needs. Whilst the motivation for animal-computer technologies is often welfare based [5], ACI also attends to other aspects, including the pet entertainment and holistic well-being sectors where many commercially available products exist [14,15]. The term welfare, as used within this work, refers not only to the animal being healthy, nourished, safe, able to express innate behaviour, comfortable and free from negative states (as defined by medical agencies) but also to viewing the animal as a ‘whole’ [16]. Within ACI, welfare is inherently linked to animal-centredness by researchers who allow consent through walking-away behaviour (innate behaviour), who research how to make systems more suitable for animals (comfort), and who often seek ways to monitor health (health and nourishment).
Academic studies pertinent to the design of ACI technologies have increased in number over the last seven years since the publication of the ACI Manifesto [9], the introduction of the ACISIG at the CHI conference [17], the first, second, third, fourth and forthcoming fifth International Animal-Computer Interaction Conferences (2014, 2015, 2016, 2017, 2018), and workshops at major HCI conferences (ISAWEL’14, ACI@BHCI 2015, NordiCHI 2014 and 2016 and OzACI@OzCHI 2017). As interest has grown in this field, the workshops and events have become more specialised with: ACI@Measuring Behavior 2016, HCI Goes to the Zoo at CHI’16, Research Methods for ACI, ZooJam at ACI’16, and Technology for Bonding in Human-Animal Interaction and FarmJam at ACI’17.
However, as we embark on the seventh year since the ACI Manifesto [9], there has yet to be an in-depth literature review that traces a path from the foundations of ACI, through the creation and use of technical means and their interrelation with animals, towards potential areas for future research. Whilst literature reviews around the field of ACI exist, such as the one for smart computing and sensing technologies for animal welfare [18], there has yet to be a direct overview of technologies within ACI. This chronicle begins by briefly exploring what ACI is and considering how the fields of animal behaviour and HCI intersect and contribute towards the embodied work. What an interaction is, or can be defined as, is also questioned in this narrative. A thematic analysis of technologies within ACI is then delivered to investigate how the current body of research adds to the overall field narratives. Drawing from this, a discussion is held on potential technological areas that ACI has yet to address, identifying questions opened through this review and concluding with an overall summary of the field.
This technology driven thematic literature review is intended to both bring clarity to those entering the field whilst highlighting potential areas of interest for those currently working in the field. Whilst this review does not tackle ethical, methodological, legal, economic and philosophical issues surrounding ACI, it is hoped that those embroiled in such topics may find this narrative useful in initiating discussions.

2. What Is ACI?

The natural question that opens the literature review, and is our starting point, is ‘What is ACI?’ At first glance, ACI can be defined by its components: the animal, the computer, and the way they work together (as HCI is defined [19]). It can also be defined, as identified within its early work, by the main goal that it seeks to meet, that being: ‘usability through a discussion about factors involved such as constraints, functionality and the user’ [20,21].
In seeking to differentiate ACI from HCI, however, it is important to step back and consider what we define as an animal. The Cambridge Dictionary (2016) offers two definitions:
1.
Something that lives and moves but is not a human, bird, fish, or insect.
2.
Anything that lives and moves, including people, birds, etc.
These definitions expose contrasting views and show two ways of looking at ACI: either as (1) an offshoot of computer science into non-human animals or (2) the encircling of HCI, CCI (Child Computer Interaction) and other subfields into an overall look at all animals, including humans as animals. Whilst the debate over the distinction between human and non-human animals has far-reaching roots back to Darwin’s approach in the Origin of Species [22], it is probably fair to say that it is largely about humans’ unique abilities, and beliefs, about the uniqueness of species. This latter point has been interpreted differently over time according to the mood of the day and the understanding of mind and action. Biologically speaking, the definition of animal refers to all members of the kingdom Animalia (The American Heritage Dictionary), but colloquial use of animal frequently refers to non-human animals as an umbrella term. The tension between the two positions challenges ACI to consider methodologically the position that animals hold within the research space. Tattersall describes the problem space well, writing that ‘We have similarities with everything else in nature; it would be astonishing if we didn’t. But we’ve got to look at the differences’ [23]. From here on, for clarity, this narrative will use ACI to refer to non-human animals, but it avoids a strict dichotomy within the animal hierarchy by focusing, like Tattersall, on the differences between species’ use of computers and computer technology.
Reflecting on the human-animal difference, the field of ACI emerged in computer science research via HCI but technology had previously been used to explore animal behaviour in other research fields (e.g., bio-logging within animal ecology and technological interventions for animal cognition studies). The inclusion of the term ‘computer’ in ACI assumes that the technology with which the animal is interacting, or which is facilitating some behaviour, is embedded with computer technology and so is able to react and interact with elements in the environment.

What Is Interaction?

Interaction, and its study, is elemental to HCI and thus also to ACI. In ACI this interaction always includes the animal and the technology but often will also include a human owner, researcher and/or carer. The study of ACI aims to enhance interaction by developing methods, philosophical stances and theories within this space. However, the terminology of what it means to have an interactive system has not yet been clearly defined within ACI. Interaction can be seen in a broad way as the framing of the relationship between people and the objects designed for them [24] but in ACI, as in HCI, interaction is more often seen as an archetypal structure, such as the feedback loop [25] where reference is made to ’an interaction,’ which is the communication between system and user. This maps onto what is described in animal behaviourism as stimuli and responses [26]. Within this definition, interaction in ACI refers to the way that the animal reacts to the technology and in return the way that the technology then responds to the animal within the feedback loop. This typical definition has been questioned by Aspling & Juhlin [27] who instead refer to interaction as a dyadic, direct and strategic interaction between multiple agencies arguing instead for Actor-Network Theory and Goffman’s [28] notion of strategic interaction in ACI.
The term ‘interaction’ is used throughout this literature review, but it is acknowledged, in the sense of Buchanan’s [24] definition of interaction, that the degree to which an animal can meaningfully interact with a computer system is unknown, as animals’ intentions, and what animals perceive as possible to do within a computer system, are unidentified. This is not to imply that animals cannot have implicit or unaware interactions that are meaningful, but that the scope for terming what is meaningful to an animal within the interaction is unknown. In HCI, this degree of representation is described in the theoretical framework coined by Norman [29] as ‘the gulf of execution’. A model of this within ACI is shown in Figure 1. Research endeavour in ACI explores this ‘gulf’ of the animal user’s intentions [20,30,31] and considers if this can be represented, directly perceived and interpreted [8]. This is explored through trying to capture the animal user’s actions and intentions. In human-human and human-computer communication, there is a rich two-way feedback loop of interactivity in which meaning is derived from the actions taken, the interpretation, and in return the output delivered. There is also evidence of this feedback loop in animal-human communication, such as between an animal and its trainer.
It is in this way that ACI often explores the gulf of execution, the top half of the feedback loop, and it is acknowledged that the loop may not be fully closed. In ACI this is due to the bottom half of the loop (the gulf of evaluation) being unknown; that is, an animal interaction can be captured, but what meaning this has to the animal (interpretation) is yet to be discovered. For these reasons, ACI is not primarily about designing complete interactive systems but more about exploring elements within them. These elements include ‘looking at behaviours’ and ‘attending to behaviours’, but not the reasoning behind animals’ actions (intentions). Within this space, all we can do as researchers is interpret these behaviours. This same interpretation is very often the case in Child Computer Interaction (CCI), and sometimes also in adult studies, in those cases where it can be hard to draw out the intention behind the interaction loop. As ACI is a relatively new field, there is clearly more research to be undertaken to explore these gulfs, particularly with support from the animal psychological and behavioural fields.
The feedback loop presented in Figure 1 has also been modelled by Freil et al. [32], who extended Norman’s [29] gulfs towards a dog computer-training scenario. Unlike the model described here, Freil et al. [32] considered the loop to be fully closed. Our view, in this paper, is that for the gulf of evaluation to register execution errors such as slips and mistakes requires specific knowledge of the animal’s intentions. This literature review therefore considers the animal’s intentions (as termed by Freil et al. [32]) to be human interpretations (as noted within the gulf of evaluation above) of the animal’s behaviour. A minimal sketch of this interaction loop is given below.
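To make this loop concrete, the following minimal Python sketch models the top half of the loop (execution): a captured behaviour is mapped, by human-authored rules, to an interpretation and a response. Everything here—the sensor, the interpretation table and the responses—is a hypothetical stand-in rather than any cited system, and the animal's actual intention deliberately remains outside the model.

```python
# A minimal sketch of the ACI feedback loop in Figure 1 (illustrative only).
# The sensor, interpreter and actuator below are hypothetical stand-ins:
# real systems must bridge the "gulf of execution" with species-specific sensing.

import random
import time

def read_sensor() -> str:
    """Stand-in for any input channel (button, accelerometer, camera)."""
    return random.choice(["paw_press", "nose_touch", "no_action"])

def interpret(action: str) -> str:
    """Human-authored mapping from captured behaviour to a label.
    Note: this is an interpretation; the animal's intention stays unknown."""
    return {"paw_press": "request", "nose_touch": "explore"}.get(action, "idle")

def respond(meaning: str) -> None:
    """System output closing the top half of the loop (e.g., dispense, play sound)."""
    if meaning == "request":
        print("dispense treat")
    elif meaning == "explore":
        print("play sound")

if __name__ == "__main__":
    for _ in range(5):
        respond(interpret(read_sensor()))
        time.sleep(0.1)
```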
Drawing back to the published literature from ACI, interaction has been considered here in a broad sense, including the animal controlling a system [7,8,33,34], systems detecting an animal’s behaviour [35,36], systems reacting to an animal’s behaviour [30,37] and the animal interacting with the human through its behaviour [4,6,11,38,39,40]. As noted within the ACI manifesto [9], the interaction can also be explored by improving the usability of systems as well as by creating a meaningful experience for the animal. This is evidenced in work that has studied how animals can input to technology [8,41,42,43], how animals can be soothed or stimulated by technology [30,44,45] and how animals and humans can be connected through technology [4,6,38,40,46]. An interesting point within these intersections is the transferal of these technologies across species and across disciplines, which will also be addressed in this manuscript through modelling the technological system’s space within ACI.

3. Technologies in ACI

This section reviews the ACI literature in terms of the different technological approaches present in ACI-related works. The following subsections describe these innovations in further detail: tangible and physical objects, haptic and wearable technologies, olfactory interfaces, screen interfaces and tracking mechanisms. This classification into subgroups originated from a thematic analysis of the technology interfaces, enabling each area to be explored in further depth. Each section summarizes the current narrative of the technology interface and then brings these findings into an overall model of technologies in ACI. This narrative aims to provide a general technological picture of the field, with future trends and opportunities discussed later in Section 4.

3.1. Model of Technologies within ACI

Figure 2 illustrates the framework proposed by this literature review, building from Jukan et al.’s [18] model regarding technologies for animal welfare. The framework proposed here focuses more towards overall technologies in ACI and thus expands Jukan et al.’s work. Reviewed here are: tangible and physical technologies, haptic and wearable technologies, olfactory interfaces, screen-based interfaces and tracking technologies. These technologies seek to aid animals in various ways: aiding human health, controlling the animal, enabling data exchange, assisting working animals, aiding service animals, enriching play and monitoring the animal.
This review, as suggested within Mancini’s early work [47], will include technologies directly involved within animal computing and not only technologies informed by animal computer interaction. This appreciates the difference between a technology user and an animal within the technology scenario. Lawson et al. [48] tackled this issue as it relates to consent by drawing from the ACI Manifesto goals and paraphrasing Eric Baumer’s terminology of “usees” for those situations where technologies are imposed upon animals without consideration, making them de facto users and participants submitting their data. Whilst it is beyond the scope of this work to define the terminology of a user, it is noted that not all animals that are directly involved in technologies are active users—some may instead be a wearer [36,49,50] or may inform a system [51,52] rather than being an active user inputting information [8,43] or directly controlling a system [53].
Following this classification presented above in Figure 2, Table 1 presents a list of the works considered within this literature review.

3.2. Tangible and Physical Objects

As technologies began to become intertwined with animals, early systems primarily focused on the animal-human communication paradigm, where humans sought to communicate with animals. These systems included the LANA (LANguage Analogue) project, where chimpanzees used “lexigrams” to create sentences and communicate with humans [116], and early button systems to allow dolphins to ask for certain toys and food [117]. Whilst early research focused on the cognitive abilities of animals, Resner [4] challenged this by looking at the animal-technology relationship from more of an HCI stance, focusing on Interaction Design (IxD). Hu et al. [60] took this idea further by creating a web-based system to allow humans to remotely interact with their dogs by giving them treats, talking to them through speakers or throwing a tennis ball they could catch. The system aimed to improve pet-human interaction and came from an HCI IxD standpoint. These early archetypes were primarily based on tangible and physical objects and collectively contribute a large number of ACI technology interfaces.
In ACI, exploration has been conducted with tangible and physical objects such as pulling devices [43,64], buttons [21,30] (Figure 3), digitally augmented toys such as tree trunks and pulleys [55,118] and plastic balls [62], and with robots [56,57].
Robinson et al. [43] created a pulley system for a medical assistant dog to call for help using a tug toy as an interaction mechanism, as this is a familiar way of interaction for a dog. Following an iterative user-centred design process, in which they tested several materials and configurations, they developed a high-fidelity prototype of a dog alarm system that could work for several scenarios [64].
Tangible objects have also been explored within ACI in their ability to give animals control over their environment and any proposed technological interventions. Along this line of thought, French et al. [45] used ordinary items found within an elephant’s enclosure to allow the automatic use of devices—in this case a shower. With orangutans, Pons et al. [62] studied how non-technological everyday objects could be augmented with auditory digital responses in order to provide a novel form of enrichment for these animals, building on their intrinsic dexterity for object manipulation. More recently, Gupfinger and Kaltenbrunner [58] have explored the use of tangible technological mediators with grey parrots so that they can produce sounds and music by activating a joystick or a rope swing; this builds from Ritvo & Allison’s [33] work with orangutans and music (see Section 3.4). A similar approach was followed to develop a tangible cylinder for an orangutan’s enclosure that produces sounds when rotated [59]. The cylinder, in this case, was attached to the wall and had a maze-puzzle embedded in it. Orangutans explored the cylinder freely, by rotating or touching it, which produced sounds not only in their enclosure but also on the human visitors’ side, augmenting and enriching the human experience of viewing captive animals at a zoo. Technology has opened a whole new range of possibilities in terms of animal enrichment, as everyday objects can now be enhanced with sensors to create more varied scenarios. For example, researchers and staff at San Francisco Zoo have created a giant puzzle feeder for rhinos [61]. The system dispenses treats when the animals investigate and manipulate it, fostering their natural foraging behaviours [61].
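As an illustration of the kind of logic behind such sensor-augmented enrichment objects, the sketch below dispenses a reward when an embedded sensor registers manipulation. The threshold, cooldown and simulated sensor are assumptions for the example, not details of the cited feeders.

```python
# Illustrative sketch of a sensor-triggered enrichment device in the spirit of
# the puzzle feeders and augmented objects above. The sensor threshold, cooldown
# and dispense routine are hypothetical, not taken from any cited system.

import time
import random

MANIPULATION_THRESHOLD = 0.6   # normalised reading that counts as "investigation"
COOLDOWN_S = 30                # avoid rewarding continuous pushing

def read_manipulation_sensor() -> float:
    """Stand-in for an accelerometer/strain gauge embedded in the object."""
    return random.random()

def dispense_treat() -> None:
    print("treat dispensed")  # would drive a motor/servo on real hardware

def run(duration_s: float = 5.0) -> None:
    last_dispense = -COOLDOWN_S
    start = time.time()
    while time.time() - start < duration_s:
        if read_manipulation_sensor() > MANIPULATION_THRESHOLD:
            if time.time() - last_dispense >= COOLDOWN_S:
                dispense_treat()
                last_dispense = time.time()
        time.sleep(0.05)

if __name__ == "__main__":
    run()
```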
Button-like devices have been one of the most common approaches when working with dogs, allowing these animals to intuitively interact with either their paws or noses. Geurtsen et al. [30] used a pressure-plate button to give dogs treats—a method in line with current consumer products for dogs such as CleverPet [15]. Another investigation of button systems with service dogs was by Mancini et al. [21], who sought to extend mobility dogs’ accessibility to their environment by mapping out the challenges dogs face in a human domain (Figure 4).
Gergely et al. [57] examined interaction with Unidentified Moving Objects (UMOs) by investigating how dogs act socially with robots. Gergely et al. [57] found that dogs form expectations of the system over a short period and act socially towards UMOs, offering an early indication of the potential of social robots for dogs. Prior to this work, Singh & Yong [119] used robotic tails on robots which could move position (wag, raise, lower and hold straight) to investigate dog tail communication states; this work is yet to be tested with other dogs to draw conclusive results. Westerlaken et al. [65] conducted a design observational study with dogs and robots, applying different variations to the robotic devices, such as puzzle or treat-dispenser cases, and with tangible objects that produced sound or smell. They observed that the material of the robot influenced the dogs’ interest in the game: when the dogs realized they could not grab the robotic ball with their mouths, they began to lose interest. One of their main observations was that different traits in each individual dog could lead to different kinds of playful preferences. In a similar fashion, Pons et al. [63] conducted an observational study with seven cats and two different small robots to methodologically investigate cats’ interactions and to see which devices the cats preferred. The authors found that the age and size of the cats was a factor, as cats usually preferred to interact with smaller robots in order to replicate playful hunting behaviour with them. This aligns with Westerlaken et al.’s [65] proposition that animals’ personal traits play an important role within the interactions.
Byrne et al. [54] have recently proposed a technological approach to predict the suitability of dogs for assistive dog training programs. The authors have explored whether aspects of canine temperament can be detected from dogs’ interactions with sensors embedded in two instrumented dog toys: a silicone ball and a silicone tug sensor. From the dog’s natural interactions with the instrumented devices, a prediction model has been created that allows for the assessment of the dogs’ outcomes in the program with an 87.5% average accuracy.
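A hedged sketch of this style of suitability prediction is shown below: per-dog summaries of toy interactions feed a binary classifier. The feature names, synthetic data and choice of logistic regression are illustrative assumptions; Byrne et al.'s actual features and model may differ.

```python
# Hedged sketch of a toy-interaction suitability model: features summarising a
# dog's interactions with instrumented toys feed a binary classifier. All
# feature names and data are invented for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical per-dog features: [bite intensity, tug duration (s), ball pokes/min]
X = rng.normal(loc=[0.5, 4.0, 10.0], scale=[0.2, 1.5, 3.0], size=(40, 3))
y = (X[:, 1] + 0.1 * X[:, 2] > 4.5).astype(int)  # synthetic "succeeded in program" label

model = LogisticRegression()
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```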
Whilst tangible and physical objects have a long history within animal technologies, the way in which these types of interfaces have been investigated has developed from a purely cognitive and behavioural standpoint towards a more interactive paradigm. Within this area, the largest body of work has been conducted for service and working animals, with recent moves towards the zoo and pet fields. There are clear gaps present within the research to investigate not only new technologies, but also how animals respond to, and can be trained more efficiently towards, technological systems that they hold, sniff, point at, paw, tug and use, in regard to both the interface and the interaction model. As the ACI field advances, work is currently shifting towards a non-training approach, especially in the context of enrichment, play or technologies aiming towards giving the animal control. A more animal-centric design perspective in these scenarios has been to deploy the technology and let the animal “become with” [102,120], where no interaction is wrong, so as to iteratively re-design the technology based on the observed interactions in a research-through-design fashion [118]. More research has yet to be done in terms of animal-centred design methodologies to account for these new insights as the field knowledge grows, questioning which user (human or animal) is really at the centre of the design [121].

3.3. Haptic and Wearable Technologies

Haptic technology is defined as “the science of applying touch sensation and control to interact with computer developed applications” [122]. A haptic device therefore uses the animal’s bodily sensations as the user interface, especially tactile feedback, to perform actions and receive input. One method of instantiating a haptic interface is the use of vibrotactile technologies, which can range from skin surface monitoring to vibrating interfaces. Lee et al.’s [11] work, one of the first ACI contributions, used a haptic wearable jacket on chickens in order to allow a human user to remotely stroke the chicken. A similar approach was presented by Réhman & Li [80], who proposed remote communication between humans and animals via vibrotactile feedback for the animal. More recently, haptic vibrotactile interfaces have been implemented for dogs, using the same research method as Lee et al. [11]. Britt et al. [34] trained a dog using a vibrotactile haptic vest that allowed a human handler to remotely guide the dog using vocal commands as well as applying vibrations on the vest. This idea has also been explored by Byrne et al. [68] using haptic cues to assist in training, an approach also evident in Morrison et al. [6], who iteratively designed a wearable vibrotactile vest to assist in direction pointing for hunting dogs, arguing that vibrotactile input aids the collaborative discussion within hunting between dog and owner. These vibrotactile haptic interfaces, however, have so far only been used with dogs and chickens, where these devices have been reported to elicit positive reactions from animals [11,38] and successful training of reported behaviours [6,34,68], leaving this User Interface (UI) open for future exploration.
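To illustrate how a handler-side cue sender for such a vest might be structured, the sketch below maps directional commands to vibration motor patterns and encodes them as byte frames. The motor layout, command names and frame format are invented for illustration and are not taken from the cited designs.

```python
# Minimal sketch of a handler-to-dog vibrotactile cue sender, assuming a vest
# with four motors (left, right, front, back). The motor layout, command names
# and byte framing are assumptions, not the cited vest designs.

from dataclasses import dataclass

@dataclass
class VibrationCue:
    motor: str        # "left" | "right" | "front" | "back"
    duration_ms: int
    intensity: float  # 0.0-1.0

DIRECTION_CUES = {
    "turn_left":  VibrationCue("left", 400, 0.8),
    "turn_right": VibrationCue("right", 400, 0.8),
    "stop":       VibrationCue("back", 800, 1.0),
}

def send_cue(command: str) -> bytes:
    """Encode a cue as a byte frame for a (hypothetical) radio link to the vest."""
    cue = DIRECTION_CUES[command]
    motor_id = ["left", "right", "front", "back"].index(cue.motor)
    return bytes([motor_id, cue.duration_ms // 10, int(cue.intensity * 255)])

if __name__ == "__main__":
    print(send_cue("turn_left").hex())  # e.g. "0028cc"
```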
Vibrotactile interfaces, however, are not constrained to wearable haptics. French et al. [55] developed vibrotactile buttons made of different materials for elephants. These interactive devices were aimed at triggering elephants’ curiosity to explore the device based on the haptic feedback they received when approaching the material with their trunks. The vibrotactile buttons allowed elephants to produce different sounds in their enclosure. In one extreme case, vibrotactile feedback was used with insects (crickets) by applying vibrations to the ground on which the crickets stood [84], although the authors did face problems with the animals shedding in reaction to the vibrations. A follow-up study investigated whether playing PAC-MAN against an Artificial Intelligence (AI) is perceived as ‘funnier’ than playing against a system replicating the animals’ movements [85].
Wearable computing has been defined as “the study or practice of inventing, designing, building, or using miniature body-borne computational and sensory devices” [123]. Wearables in HCI have been used for a wide variety of purposes, and their proliferation has also reached the field of ACI. For instance, biotelemetry devices can be considered wearable interfaces, although there remains debate over applying the term ‘user’ to this scenario, with ‘wearer’ being the preferred term. Biotelemetry devices have been used for many years in biological research, playing an important role in the development of behavioural science and in ACI research which looks at the wearability of these devices [50,77]. Biotelemetry devices have also been used to inform blind dog owners of the real-time heartbeat and respiration rates of their dogs while taking a walk [40]. The information provided by the biotelemetry sensors on the dog was transmitted to the human handler by means of either vibrotactile feedback or audio devices. Mealin et al. [40] found that although vibrotactile interfaces provide more accurate responses, dog handlers preferred to use audio devices. In more recent work, Mealin et al. [76] designed a system that collected physiological data from guide dogs in training using wearable devices. The collected data facilitated objective analysis of the dog during early stages of training, helping to predict how successful a dog would be in the program.
Several works have addressed the necessity of tracking animals, beyond biotelemetry devices, in different scenarios; this is expanded on in Section 3.6. However, despite the increase in tracking technologies, the most common method of gathering information about the animal has been to use wearable harnesses or collars with attached technological devices that feed information to the system in charge of processing it. One of the most basic methods for animal tracking in outdoor scenarios has relied on GPS or radio-frequency localization, attaching the emitter devices to a collar or harness. These systems only give information on the animal’s location and have been used by pet owners, mostly to assess their dogs’ locations and to determine whether or not they are in trouble [75]; this technology also exists in commercial products such as FitBark [14]. This technology has also been used during hunting activities with dogs, allowing the human leading the hunting activity to interpret the movements of the dog in the field by following its signal on a handheld display [78,87].
Whilst GPS is useful, several outdoor scenarios require more precise information about the animals’ movements or body postures during the activity, and even some kind of communication from the animal to the human side. As an example, determining the pose of the animal is of vital importance in the case of Search and Rescue (SAR) dogs, which often have to work away from human sight; it would be beneficial if the handlers knew the dogs’ location and pose. This type of recognition of animals’ postures and activities is usually performed using accelerometers, gyroscopes or other inertial measurements. Most of the works in ACI using these products have been with dogs, probably because of their use as working and assistive animals [66,91]. Most wearable activity/posture recognition works use a tri-axial accelerometer located on the dog’s collar and then apply classification techniques to the accelerometer data in order to recognize the activity or posture. There are several devices for dogs, some of them commercial, such as Whistle® [92], FitBark® [14] or WagTag™ [88], which make use of a tri-axial accelerometer to perform basic activity level recognition. In [74], dogs wear a tri-axial accelerometer on the collar and, after being trained with a k-nearest neighbours (kNN) classifier, the system is able to differentiate between 14 activities and 2 postures.
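The sketch below illustrates this common pipeline—windowed tri-axial accelerometer readings reduced to simple statistics and classified with kNN—on synthetic data. The window length, features and two-class setup are illustrative simplifications of the cited 14-activity systems.

```python
# Sketch of a collar-accelerometer pipeline: windowed tri-axial readings are
# reduced to simple statistics and classified with k-nearest neighbours.
# Synthetic data; window length and features are illustrative choices only.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def window_features(window: np.ndarray) -> np.ndarray:
    """window: (n_samples, 3) accelerometer readings -> per-axis mean and std."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

rng = np.random.default_rng(1)

def synth_windows(label: int, mean: float, n: int = 30):
    """Generate n windows of 50 samples around a given acceleration level."""
    wins = rng.normal(mean, 0.3, size=(n, 50, 3))
    X = np.array([window_features(w) for w in wins])
    return X, np.full(n, label)

X0, y0 = synth_windows(0, 0.0)   # e.g. "lying"
X1, y1 = synth_windows(1, 1.0)   # e.g. "walking"
X, y = np.vstack([X0, X1]), np.concatenate([y0, y1])

clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
test, _ = synth_windows(1, 1.0, n=5)
print(clf.predict(test))  # expect mostly 1s
```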
Within the FIDO project [72], researchers have been studying how wearable devices could mediate the communication between working dogs and their handlers, much like Morrison et al. [6]. The FIDO project has undertaken extensive work on providing dogs with suitable wearable activators [73]. In addition, they have also considered mediating this communication by recognizing motion-based dog gestures—sit, spin, roll, jump, etc.—using a three-axis accelerometer attached to the front of a service dog harness [82]. More recently, they have studied the use of a dog collar with an accelerometer and gyroscope for the recognition of head gestures in dogs [35,83]. Whilst they struggled with the sensitivity of such devices, the researchers did find that gesture recognition through collars was viable, and they suggested looking at how a dog was trained (i.e., with a leash) for an indication of gestures that could be instantiated in such systems.
The effectiveness of wearable harnesses with several inertial measurement units located along the harness has also been studied. The work of Ribeiro et al. [81] uses the angles of two accelerometers at different locations on the dogs’ harness to develop an algorithm capable of estimating four poses: standing, lying down, sitting, and walking. Another project [66,67] extended this idea by using more inertial measurement units located at optimal locations on a dog’s body, determined according to the algorithm’s performance and the dog’s comfort and physiognomy. Using the information provided by these units and applying machine learning techniques, five static postures and three dynamic behaviours can be identified. They have also compared the performance of the classification algorithm using supervised against unsupervised classification methods [52]. The knowledge from these previous studies came together in Majikes et al.’s [36] research, which used a harness vest system with dogs, like those used by Byrne et al. [68], Britt [34] and Lemasson et al. [38], to monitor a dog’s posture during eating, standing, lying, sitting and standing on two legs (Figure 5).
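As a rough illustration of angle-based pose estimation of this kind, the sketch below derives a pitch angle from a static accelerometer's gravity components and combines two sensor angles with toy decision thresholds. The thresholds and the two-sensor rule are invented for the example, not Ribeiro et al.'s algorithm.

```python
# Rough sketch of angle-based pose estimation in the spirit of the harness
# work above: a static accelerometer's pitch relative to gravity hints at
# body-segment orientation. Thresholds below are invented for illustration.

import math

def pitch_from_accel(ax: float, ay: float, az: float) -> float:
    """Pitch angle (degrees) of a static sensor from gravity components."""
    return math.degrees(math.atan2(ax, math.sqrt(ay**2 + az**2)))

def estimate_pose(back_pitch: float, chest_pitch: float) -> str:
    """Toy decision rule combining two sensor angles (back and chest)."""
    if abs(back_pitch) < 15 and abs(chest_pitch) < 15:
        return "standing"
    if back_pitch < -40:
        return "sitting"   # rear lowered tilts the back sensor
    return "lying"

# Example: back sensor tilted down, chest near level -> "sitting"
print(estimate_pose(pitch_from_accel(-0.7, 0.0, 0.7),
                    pitch_from_accel(0.1, 0.0, 1.0)))
```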
In the vest system for dogs, Majikes et al. [36] extended the usefulness of haptic devices by mixing the vest outputs with human analysis for interpretation and found that this can lead to a higher rate of successful training. The authors hope in future to take this device into a fully autonomous system and to cue trained behaviours leading to complex behaviours.
Acceleration data-loggers are also a common and efficient way of detecting cats’ body postures and frequent behaviours based on movement [86]. Commercial devices for cat activity recognition are also available, such as PawTrack® [79], which detects whether the cat is at home or outside, and offers GPS geolocation for outdoor walks. However, it does not monitor activities or gestures, unlike the non-commercial research device Cat@Log [90]. Cat@Log consists of a cat collar device with several sensors: a camera, a GPS, an accelerometer, a Bluetooth module, a battery and a micro SD card. The camera provides videos of the cat’s view, while the accelerometer data allows recognition of activities such as sleeping, jumping, walking or scratching.
Canine Amusement and Training [89] presents a home-based wearable tracking system for dogs that moves away from accelerometers. It consists of Infra-Red (IR) emitters attached to the dog’s harness, and a Wiimote’s IR camera placed on the ceiling. The system detects the location and posture of the animal by tracking the IR emissions of the harness using the Wiimote. The detected postures and location are used by the system to determine whether the dog is correctly performing the training activities that the system proposes.
In recent years, ACI technology has also considered how wearable technology can improve animal welfare in farming scenarios. Haladjian et al. [70] studied motion sensors inserted inside a pig’s ear in order to classify the pig’s physical activities into ‘walking’, ‘eating’ and ‘resting’. This ongoing work would enable veterinarians to keep track of free-roaming pigs; the pilot study showed the approach was viable and would even enable the tracking of pig activities with an accuracy of 95.8%. In a similar manner, Haladjian et al. [71] implemented and evaluated a wearable device to be attached to a cow’s hind left leg. This device could detect gait anomalies in cows effectively, even for large cow populations. The proposed device builds an individual model of the usual walking pattern of a cow during the first minutes of use and detects deviations from this model, reaching 91.1% accuracy. Recently, Carpio et al. [69] proposed a novel smart farming system and application framework based on wearables and cloud computing, with an emphasis on animal welfare features for cows and pigs. As opposed to haptics, it can be observed that wearable interfaces do not always imply that the animal is an active user of the technology. In some scenarios in which the technology is placed on the wearable for sensing purposes, the animal might not be required to make any kind of interaction, other than being just a wearer, and thus is clearly not aware of the data exchange that is happening between the wearable technology and the data receiver.
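The per-animal baseline idea can be sketched as follows: a short calibration window establishes an individual's normal stride timing, and later readings are flagged when they deviate strongly. The stride-interval feature, z-score rule and threshold are assumptions for illustration, not the cited implementation.

```python
# Hedged sketch of a per-animal gait anomaly detector: learn a baseline of
# normal walking features in a calibration window, then flag deviations.
# The feature, threshold and z-score rule are illustrative assumptions.

import numpy as np

class GaitBaseline:
    def __init__(self, z_threshold: float = 3.0):
        self.z_threshold = z_threshold
        self.mean = self.std = None

    def calibrate(self, stride_intervals: np.ndarray) -> None:
        """Learn this individual's normal stride timing (first minutes of wear)."""
        self.mean = stride_intervals.mean()
        self.std = stride_intervals.std() + 1e-9

    def is_anomalous(self, interval: float) -> bool:
        """Flag stride intervals far from the individual's own baseline."""
        return abs(interval - self.mean) / self.std > self.z_threshold

rng = np.random.default_rng(2)
model = GaitBaseline()
model.calibrate(rng.normal(0.8, 0.05, size=200))          # normal strides ~0.8 s
print(model.is_anomalous(0.82), model.is_anomalous(1.4))  # False True
```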
As laid out above, the research and commercial space for wearable and vibrotactile devices has grown significantly in the last few years beginning with chicken and dog systems, and recently extending into livestock and farming scenarios. These systems, in general, are aimed towards working and service animals, focusing on control and data exchange. However, there is clearly a gap between using wearable interfaces to quantify the animal’s actions, and pairing this data with vibrotactile interfaces to allow a two-way feedback from the animal to the human and vice versa. In addition, as more of these devices are deployed it is essential to continue studying how the use of haptics, wearables and biotelemetry devices affects the behaviour of the animal wearer.

3.4. Olfactory Interfaces

Grounded within the principles of animal-centred design, Johnston-Wilder et al. [41] (Figure 6) and Mancini et al. [37] have created interfaces using pressure plates to provide supplementary information in the olfactory detection of cancer by dogs. These studies, building from dogs’ olfactory work, have found that an olfactory system is a possible interaction method within ACI, highlighting the potential of this approach as it allows for the dogs’ natural behaviour while sniffing the samples. Analysis of the pressure patterns from the sensors made it possible to distinguish between positive, negative and uncertain samples, removing errors derived from human interpretation of the dog’s interactions. These authors are currently further exploring learning algorithms to implement these pressure patterns as a recognition tool.
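A minimal sketch of how pressure traces might be reduced to a three-way label is given below, using dwell time and peak pressure as features. These rules and thresholds are invented for illustration; the cited systems' analyses and forthcoming learning algorithms differ.

```python
# Hedged sketch of pressure-pattern interpretation: a dog's sniffing dwell
# time and peak pressure on a plate are reduced to a three-way label.
# Thresholds and rules are invented for illustration only.

import numpy as np

def classify_sample(pressure_trace: np.ndarray, sample_rate_hz: float = 100.0) -> str:
    """pressure_trace: normalised 0-1 plate readings while the dog sniffs one sample."""
    active = pressure_trace > 0.2              # dog in contact with the plate
    dwell_s = active.sum() / sample_rate_hz
    peak = pressure_trace.max()
    if dwell_s > 1.5 and peak > 0.7:
        return "positive"                      # long, firm indication
    if dwell_s < 0.5:
        return "negative"                      # brief sniff-and-leave
    return "uncertain"

trace = np.concatenate([np.zeros(50), np.full(200, 0.9), np.zeros(50)])  # 2 s firm press
print(classify_sample(trace))  # -> "positive"
```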
Lawson et al. [48] have also proposed olfactory systems for dogs’ socialization within a speculative design fiction system they term “the internet of dogs”. These design fiction scenarios allow dogs to use smell as a communication system by focusing on the production and identification of odours, suggesting that dogs’ primary sense of smell could be used as a CAPTCHA within this “internet of dogs”. During this work, Lawson et al. [48] noted previous speculative designs that proposed the use of dogs as olfactory authentication mechanisms within ordinary technology systems (e.g., ATMs), questioning the user-centric values of future olfactory technology.
Moving away from dogs, Kobayashi et al. [93] describe a system for human-animal interaction in wild environments, for deer. The proposed system consists of a remote-controlled rotating table that the user manipulates using a screen interface. The table, surrounded by surveillance cameras, is located in a wild environment, and has a deer cracker made of rock salt placed upon the table. The system is intended so that the smell of the rock salt would attract wild deer, creating a remote interactive experience with wildlife for the human user.
In summary, olfactory technologies as interactive input mechanisms in ACI have so far only been studied within service dog interfaces and speculated about with dogs and deer. The exploration of olfactory technologies for animals introduces many unknown questions, not only about the animals’ interpretation of the interface but, as Lawson et al. [48] express, about “what information the dog has understood, or transmitted”. As technology develops further in this field, so will the variety of animal users to which it can be applied, and the instances in which it can be used will grow, even if we do not fully understand the animals’ reasoning.

3.5. Screen Technology

Whilst many novel interface devices are used in ACI, the classic visual and touchscreen interfaces contribute a large proportion of animal-computer research. Research in this area has included using tablets as UIs for dogs watching videos [44], screens for remote notification systems [8] (Figure 7), tablet games for cats [3] and orangutans [101] (Figure 8), wall interactive devices for pigs [94] and investigations on the usability of screens for dogs [31].
Screens can be configured solely as output technologies or, more recently, as input/output using a stylus or touch. Many enrichment activities conducted at zoos and sanctuaries with orangutans have involved the use of touchscreens or tablets, such as in the Apps for Apes project [95]. This project has been used in several zoos around the world, including with orangutans at Melbourne Zoo [101]. The most common scenario for this sort of interaction is where a human keeper holds the tablet in front of the apes’ enclosure so that an orangutan can touch the screen with its digits through the mesh. The human keeper typically encourages the orangutan to engage with the different apps offered on the touchscreen (Figure 8). Orangutans have also been shown to be able to use visual touchscreen interfaces with a stylus [33] within a musical preference study. In this work, orangutans could use a branch within their enclosure as a stylus to select one of the two halves of a screen, and depending on the side chosen, a different musical piece was played. Importantly, within this study it was found that orangutans often rejected the technology. Allowing natural interactions with technology exposes many interesting things, including non-use. Wirman [102,103] has extensively explored how orangutans naturally interact with screen-based technologies, listing all the different ways orangutans explored the tablets or screens they had lying around their environment. These included not only touching the screen with their digits (fingers), but also using sticks, licking with their tongues, and pouring liquids over them [103]. Another example is the work conducted by Perdue et al. [42], who installed a touchscreen in a naturalistic tree structure inside the orangutans’ enclosure at Zoo Atlanta. This work studied the effect of the touchscreen on orangutans’ behaviours and also assessed human visitors’ perceptions of the animal-computer interaction. At Indianapolis Zoo, touch panels have been used with orangutans to provide them with enrichment activities while at the same time allowing for the study of cognitive abilities in different tasks [97]. These authors also created an installation in which humans could interact, and play together, with the orangutans in a shared touch panel (Figure 9).
Playful interactions between animals and screen devices within ACI have also been studied for non-primate species, including pigs, cats and dogs. Alfrink et al. [94] proposed a novel interface for pigs’ enrichment, in which pigs and humans could remotely interact via a giant touchscreen located in the pigs’ enclosure. The human would remotely control a visual element that appeared on the pigs’ screen, so that the pigs could follow it and touch it with their noses, creating a combined way for humans and animals to use a screen interface together. Building from this human-animal screen interaction, Westerlaken & Gualeni [3] developed and evaluated a tablet-based game for cats coined Felino. Felino was designed following an animal-centred perspective, in which the interaction could be adapted on the human side towards the ongoing perceived experience of the cat players. Baskin et al. [44] have studied humans’ perception of dogs’ interactions with tablets, considering the screen device as an output interface on which dogs could watch videos. Their study built from Hirskyj-Douglas and Read’s [124] work on increasing human perception of dog behaviours with screens using the Dog Information Sheet (DISH). Baskin et al.’s [44] results acknowledged that not all perceived interactions could be considered playful and that careful consideration needed to be taken when designing these kinds of interactive experiences. This issue has previously been highlighted in Lawson et al.’s (2015) work on speculative design with dogs and cats, where it was found that humans would often trust the judgement of the technology over scientific judgement.
Zeagler et al. [104] examined touchscreen interfaces with dogs for an alert interface and pointed out that affordances should be investigated to make touchscreens more usable for these animals, i.e., the appropriate use of colour and understanding the spacing between activation ‘dots’. More recently, Zeagler et al. [8] presented work on the training methods for implementing these systems with dogs using touchscreen (nose) interfaces. Zeagler et al. [8] sought to train dogs to connect two dots on a touchscreen interface by firstly training a dog to touch a single dot and then training the dog to slide the nose between two dots, to create a dog alarm system similar to Robinson et al.’s [64] tangible work mentioned in Section 3.2. Their work presented guidelines for future touchscreen interfaces around the type of interfaces (non-projection monitors), the target distance and size (at least 3.5 inches) and the best training paradigms (shaping for training touch and backchaining for sequential tasks) towards getting dogs to achieve the behaviour modification required to use the technology. Importantly, Zeagler et al.’s [8] work found that first-contact touchscreen interactions are easier for dogs to use and understand than lift-off interactions, adding to the design considerations and training methods in the space of dog screen interfaces. The authors seek to further explore this space through fully training dogs to use such screen systems to ‘call for help’.
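The sketch below applies two of these guidelines—large circular targets of at least 3.5 inches and first-contact activation (triggering on touch-down rather than lift-off)—as a simple hit test. The assumed display density and target layout are illustrative.

```python
# Illustrative hit-testing sketch applying the guidelines above: large targets
# (>= 3.5 inches) and first-contact activation (fire on touch-down, not
# lift-off). Screen DPI and target layout are assumptions for the example.

from dataclasses import dataclass

DPI = 96  # assumed display density

@dataclass
class Target:
    cx: float  # centre x in pixels
    cy: float
    diameter_in: float = 3.5  # minimum target size from the cited guidelines

    def contains(self, x: float, y: float) -> bool:
        r = (self.diameter_in * DPI) / 2
        return (x - self.cx) ** 2 + (y - self.cy) ** 2 <= r ** 2

def on_touch_down(targets, x: float, y: float):
    """First-contact activation: fire on touch-down instead of lift-off."""
    for i, t in enumerate(targets):
        if t.contains(x, y):
            return i
    return None

targets = [Target(200, 300), Target(700, 300)]
print(on_touch_down(targets, 210, 310))  # -> 0
```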
Whilst touchscreen design is clearly interesting, ACI has still not really understood the extent to which animals attend to, and can interpret, what is being shown on screen-based displays. Dogs’ attention to screens has previously been explored in animal behaviour studies that have tracked vision [110,112] or used touchscreens [100] with static images. Extending this study into moving media is beginning to be explored within ACI through work around screen interaction with artificial presences and virtual reality systems [98] and with methods to analyse multiple screen systems [31]. Hirskyj-Douglas et al. [31] created a method to test a dog’s viewing habits, preference and following across three screens, with initial findings indicating that dogs do have a media preference but also low attention times with screen devices. Hirskyj-Douglas et al.’s [31] core contribution within screen devices, however, was that their method allowed dogs not to attend to the screens, unlike previous work [110,112], echoing Ritvo and Allison’s [33] findings of animals rejecting technology.
Ohta et al.’s [98] work-in-progress research plans to use interactive video interfaces to investigate the effect of the visual feedback loop (with partial depth cue perceptions) on animals’ behaviour, in order to investigate the attachment between dogs and robots. This research builds on Kerepesi et al.’s [96] previous work on visual communicative signals between dogs and humans and cats and humans, and Pongracz et al.’s [99] work on projecting human images to signal to dogs. Wallis et al. [100] take these ideas further by seeking ways for screens to provide mental stimulation for aging dogs from an animal welfare stance. The mental stimulation is designed by means of touchscreen discrimination between two objects, giving positive or negative feedback for cognitive enrichment. Delineating from this, Hirskyj-Douglas & Read [53] built methods for dog-driven screen devices for dogs in their homes by using IR proximity sensors to detect a dog and displaying media to the dog when present. They used this method as a way of flipping the paradigm of normal media devices by allowing the dog to control, and choose, whether to use the system.
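The control flow of such a dog-driven media device can be sketched simply: media plays only while a proximity sensor reports the dog nearby, so the animal chooses by approaching or walking away. The sensor reading, threshold and polling loop below are simulated stand-ins, not the cited implementation.

```python
# Minimal sketch of a dog-driven media trigger: media plays only while a
# proximity sensor detects the dog, leaving the choice to approach (or walk
# away) with the animal. Sensor and player calls are simulated stand-ins.

import random
import time

PRESENCE_THRESHOLD_CM = 80

def read_proximity_cm() -> float:
    """Stand-in for an IR proximity sensor reading."""
    return random.uniform(10, 300)

def run(poll_s: float = 0.5, cycles: int = 10) -> None:
    playing = False
    for _ in range(cycles):
        present = read_proximity_cm() < PRESENCE_THRESHOLD_CM
        if present and not playing:
            print("dog detected: start media")
            playing = True
        elif not present and playing:
            print("dog left: stop media")
            playing = False
        time.sleep(poll_s)

if __name__ == "__main__":
    run()
```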
There is, therefore, a growing, though still limited, body of research, as demonstrated above, investigating and mapping animals’ requirements for visual interfaces. One way that requirement gathering and evaluation have been conducted with visual interfaces in HCI and animal technologies is through tracking technologies, as described in Section 3.6. Another core area of discussion regarding screen technologies is whether the interaction with the interface remains user-centric from the animal’s perspective, depending on its species. For example, screen technologies for zoo enrichment for great apes do not usually allow the animal to hold the device, and screen technologies for animals currently tend to require training, influencing the animals’ ordinary behaviours. Nevertheless, screen technologies have been one of the most researched areas within ACI, with the flexibility and prolific occurrence of these interfaces offering insightful results to inform the fields of ACI and HCI for non-typical users of screen systems. As new perspectives and methodologies are incorporated, future applications and explorations of these devices remain to be discovered.

3.6. Tracking Technologies

The notion of eye tracking, as a way to examine focus, has been around since the 1800s, when eye-movement studies were conducted through direct observation; Edmund Huey progressed the field in 1908 using contact lenses, with a hole for the pupil, attached to aluminium pointers [125]. In the late 1990s and early 2000s, eye-tracking technology was expanded towards animal users, focusing mostly on primates and dogs; some of this work involved surgical interventions [126]. Body, face, eye and gaze positioning have played a part in understanding human and animal behaviour in ACI through tracking gaze [112], body posture [51,114] and automated face reactions [127], similarly to HCI [128,129]. The advancements made in HCI tracking technology have not yet been fully exploited in ACI technologies, but there is an increasing corpus of ACI studies regarding the tracking of animals, including horses [108,109], cats [51,130] and dogs [110,112,114] (Figure 10).
Williams et al. [110] wanted to increase spatial accuracy for laboratory settings by using a mobile, head-mounted, video-based eye-tracking system, achieving an accuracy of 2–3°. Somppi et al. [112] took a different approach from Williams et al. [110]: instead of training a dog to wear a mounted system, they trained dogs to rest their heads upon a headrest to achieve contact-free eye-movement tracking. Unlike Williams et al. [110], Somppi et al. [112] used pictures rather than treat-location tracking. Somppi et al.’s [112] research provided evidence that dogs focus their attention on informative regions of images, where their gaze fixation depended upon the image’s category (human, dog, shape and letter). This discrimination of images led to suggestions that dogs can discriminate between images of different categories, corresponding with Farago et al. [131], who found that dogs consider natural objects more interesting than abstract ones. Somppi et al. [112] did comment, however, that they cannot yet draw any conclusions as to whether the attention of dogs was directed towards stimulus features, semantic information or a mixture of both, opening up questions in animal tracking around the impact of the complexity/simplicity of the image on findings. This work did point towards species-dependent behaviour [132,133] when viewing faces in a more natural setting, moving Williams et al.’s [110] goal of naturalistic tracking forward. Research into animals’ cognitive processing of technology is particularly needed in animals where welfare is of concern, because they cannot vocalise opinions and choices [9]. Animals can be trained to use tracking systems [112] or can be tracked wearing head-mounted systems [110], but both of these strategies are known to influence their ordinary behaviour, which, ironically, is the very thing that researchers in these studies are typically aiming to measure, as noted in Hirskyj-Douglas et al.’s [31] work mentioned in Section 3.5.
The constraints and difficulties of tracking technologies that limit animals’ natural behaviours leave a space open within animal computing to draw back to the original observational tracking methods in HCI, allowing animals to explore technology in ordinary ways and merging early human methods with current usability methods. ACI has recently proposed image-based, human-interpreted recognition systems with horses [108,109], orangutans [111], giraffes [105], cats [51,130] (Figure 11) and dogs [107,114].
These non-intrusive tracking systems vary in how they operate, with some using image shape recognition [51], feature and posture recognition [51,107,109,114], motion recognition [111], proximity [53], and point recognition [110,112]. Pons et al. [51] used a Microsoft Kinect depth sensor facing down from the ceiling to record cats’ natural behaviour when playing. This data was used to create a non-wearable tracking system capable of recognizing cats’ postures (sitting, semi-sitting, walking, standing, jumping and turning), body parts (head, body, tail) and orientation (Figure 11). The tracker used the average area of the cat’s contour, the number of pixels and the average depth of each cluster to classify the image. The performance of this non-wearable tracker has been extensively analysed in a follow-up study [130] in which the authors compared several machine learning classifiers and a greater set of features for the classification.
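To illustrate the kind of depth-image features these trackers build on (pixel count, mean depth, bounding-box shape), the sketch below computes them from a synthetic overhead depth frame after a naive background subtraction. Real systems use more careful segmentation and richer feature sets; the thresholds here are assumptions.

```python
# Sketch of simple depth-image features (pixel count, mean depth, bounding-box
# aspect) from a synthetic overhead frame. The naive background subtraction is
# an illustrative stand-in for real segmentation pipelines.

import numpy as np

def posture_features(frame: np.ndarray, background: np.ndarray,
                     min_diff_mm: float = 100) -> dict:
    """Segment pixels closer to the camera than the background, then summarise."""
    mask = (background - frame) > min_diff_mm   # animal is nearer -> smaller depth
    pixels = frame[mask]
    return {
        "pixel_count": int(mask.sum()),
        "mean_depth_mm": float(pixels.mean()) if pixels.size else 0.0,
        "bbox_aspect": _bbox_aspect(mask),
    }

def _bbox_aspect(mask: np.ndarray) -> float:
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return 0.0
    h, w = int(np.ptp(ys)) + 1, int(np.ptp(xs)) + 1
    return w / h

background = np.full((240, 320), 2000.0)   # floor at 2 m
frame = background.copy()
frame[100:140, 80:220] = 1700.0            # a cat-sized blob 30 cm above the floor
print(posture_features(frame, background))
```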
Building on Pons et al. [51], Mealin et al. [114] (Figure 12) also used a Microsoft Kinect depth sensor for posture and feature recognition, in this case for dogs. Rather than clustering features, Mealin et al. [114] used average depth values and the aspect ratios of bounding boxes around the animal for the classification. Both of these systems, however, are still semi-supervised: in the case of Mealin et al. [114], multiple images of the background must be included within the training data set to obtain better depth readings when comparing single images, simplifying the separation of the dog from the background. This requires an expert user to implement these systems. Several works, especially in the area of animal behaviour, have used just colour cameras to detect animals' shapes and track their movement without any posture detection, such as with pigs [115], dogs [113], chickens [11] or mice [106].
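The role of the background images can be pictured as a simple depth background model: several empty-scene frames are averaged, and foreground (animal) pixels are those markedly closer to the sensor than that model. This is a hedged sketch of the general technique rather than Mealin et al.'s [114] method; the threshold value is an assumption.

```python
import numpy as np

def build_background(frames: list) -> np.ndarray:
    """Average several empty-scene depth frames; using multiple background
    images smooths per-pixel sensor noise in the depth readings."""
    return np.mean(np.stack(frames), axis=0)

def foreground_mask(depth: np.ndarray, background: np.ndarray,
                    threshold_mm: float = 80.0) -> np.ndarray:
    """Flag pixels sufficiently closer to the sensor than the background
    model as candidate animal pixels."""
    return (background - depth) > threshold_mm

def bbox_aspect_and_depth(mask: np.ndarray, depth: np.ndarray):
    """Aspect ratio of the foreground bounding box plus its average depth,
    mirroring the two feature types named above."""
    ys, xs = np.nonzero(mask)
    width = xs.max() - xs.min() + 1
    height = ys.max() - ys.min() + 1
    return width / height, float(depth[mask].mean())
```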
North et al. [108,109] have proposed, and are currently developing, a video-based automated behaviour identification software tool, named HABIT, for observations of both horse-to-horse and horse-to-human interaction. In 2017, North [108] proposed extracting salient features, such as the horse's ears, in order to begin detecting horses in real scenarios and from a variety of viewpoints. Also taking a very animal-centric approach, Webber et al. [111] created and evaluated a tracking system for orangutans in a zoo environment. The interactive system consisted of projecting images onto the ground of the orangutans' enclosure, while a depth sensor tracked the movement and touch of the orangutans over the projections. This system, aimed at enrichment, allowed a more natural exploration of the technology from the animals' point of view. Another approach, proposed by Dong et al. [105] for zoo environments, is the use of a single thermal vision camera for giraffe identification and tracking in an unconstrained environment. This work detects giraffes within the image through their body temperature and distinguishes them from other animals by applying machine learning techniques. The proposed system is capable of tracking the movement patterns of giraffes within their enclosure during day and night, allowing 24-h monitoring.
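The thermal detection step can be sketched as thresholding a temperature image and keeping sufficiently large warm connected regions. The cutoff and minimum-area values below are illustrative assumptions, and the machine learning step that Dong et al. [105] use to distinguish giraffes from other animals is omitted.

```python
import numpy as np
from scipy import ndimage

def detect_warm_bodies(thermal_c: np.ndarray,
                       min_temp_c: float = 30.0,
                       min_area_px: int = 100):
    """Return bounding boxes (x0, y0, x1, y1) of large connected regions
    whose temperature exceeds an assumed body-heat threshold."""
    labels, n_regions = ndimage.label(thermal_c > min_temp_c)
    boxes = []
    for i in range(1, n_regions + 1):
        ys, xs = np.nonzero(labels == i)
        if xs.size >= min_area_px:  # discard small warm spots (noise, lamps)
            boxes.append((int(xs.min()), int(ys.min()),
                          int(xs.max()), int(ys.max())))
    return boxes

# A synthetic night-time frame: cool enclosure with one warm animal-sized blob.
frame = np.full((120, 160), 18.0)
frame[40:70, 60:100] = 36.5
print(detect_warm_bodies(frame))  # [(60, 40, 99, 69)]
```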
The space for tracking animals is sparse but growing towards untrained and unsupervised off-body systems, much as vibrotactile systems are growing into on-body systems. From this review, it seems that whilst tracking technology has a growing use within ACI, particularly in image and posture recognition, there is still a gap in the research needed to ensure its usefulness with respect to the animal's normal behaviour. Contact-free and training-free eye-tracking systems built specifically for animals still do not exist. In addition, complex behaviours cannot be inferred from basic posture tracking, which makes it difficult to build a system capable of fully reacting to the animals' intentions. There is clearly a space open within ACI for studying non-wearable tracking of animals' behaviour, in which interdisciplinary approaches and knowledge from animal behaviour are essential.

4. Future of Technologies within ACI

As this field is a relatively new area of investigation, much further work is needed to iteratively test the technological interfaces, designs, systems and models outlined here with larger datasets: varying animals and breeds, at different ages, in different contexts and situations. In addition, several works within ACI that have not been covered in this literature review have focused on methodologies for, and the ethics of, designing and building technology for and with animals; this leaves an open body of work. As the technology around ACI grows, the results will inform the development and formal definition of specific methods for this discipline, drawing from conjoining fields such as HCI and animal behaviour. In return, these methods will greatly help to advance the development of new technological devices and systems from a further grounded animal-centred perspective. The technology-informed results described above are situated within the time and context of current technology; with advancements, turning towards the pet, zoo and farm specialties and new computer interaction interfaces, some of the obtained results may change significantly over time.
The research works discussed within this literature review have been received with interest within the communities where they were presented; however, the acceptance of these views varies depending upon each community's established paradigms. The ACI community embraces the animal-centric technologies and results gathered, as the philosophical approach fits within the paradigms laid out in the ACI Manifesto [9]. The HCI community embraces the recognition of method transferences, particularly between animals, children and other non-verbal users, to elicit new methods [47,134]. In the animal research community, the animal-centric approach is often considered in a more flexible way; for example, training is frequently carried out and initial results are considered a source of information, in contrast with more goal-oriented systems [63,65]. For animal owners and those who look after animals, the potential outcomes of the ACI field hold promise for entertainment and enrichment, as shown by the recent growth in pet technology products for monitoring [14] and playful robotic systems [135]. However, the animal-centred perspective of ACI studies is not always materialized in industry designs, which calls for careful consideration when deploying pet technology in homes, zoos and farms. In this regard, both industry and the ACI community still need to find ways to complement each other and work together towards ensuring animal welfare.
The rest of this section sets out a research agenda for future work with implications for each of the communities outlined above. This research agenda forms an important contribution of this literature review and is drawn from the gaps identified, informing both novice and experienced researchers within ACI.

4.1. Animal-Driven Devices for Enrichment and Work

Within modern society many animals, such as pets and farm animals, often spend time alone. Devices designed for animals to support these alone periods could provide a usable platform for developing enrichment technology. This is not to say that enrichment is a solitary process; technology could also align with the human-animal bond to enrich play, and there are already several works and projects devoted to this line of research [51,102,111]. Most of the works proposing enrichment or playful technology for animals have focused on pets and zoo animals, while studies centred on farm animals have been mostly oriented towards welfare and data exchange (see Figure 2). However, farm animals, which are arguably the most mistreated animals within modern society, could also greatly benefit from playful technological interventions, which can certainly be another way of improving their welfare. The ACI community is slowly beginning to explore these enrichment opportunities [136].
One of the most studied species within ACI is the dog, and one of the biggest growth areas of ACI is devices to support a dog's work. This has meant that, in some ACI technologies, rather than the human being in control of the system, the animal is trained to use the system. This training varies from the animal making simple behavioural choices, such as activating a device when faced with certain situations [8,30,64], through to being trained to sort something [37,41]. At the other end of the scale, there are less constructed and more exploratory interactions in which an animal is presented with technologies and its behaviours are documented, such as interactions with robots [57,63,65]. A future step for ACI could be to explore more deeply the idea of animal-centred methods [9,121], allowing technology to be shaped around animals' affordances. In this way, the technology used will become even more suitable for the animal. Building from previous investigations of button systems [21], enrichment for zoo enclosures [118], and screen systems [31], one of the promising mechanisms for exploring animal-centredness could be the design and development of animal-activated systems; for example, tracking systems could be used to let natural behaviours of the animal initiate an interaction, as sketched below.
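As a concrete illustration of this direction, the sketch below shows the control loop of a hypothetical animal-activated device that stays idle until a proximity reading suggests the animal has approached of its own accord, in the spirit of animal-driven devices such as DoggyVision [53]. The sensor function, thresholds and response are all assumptions for illustration.

```python
import random
import time

ACTIVATION_DISTANCE_CM = 50.0  # assumed distance at which the animal "opts in"
COOLDOWN_S = 30.0              # avoid re-triggering while the animal lingers

def read_distance_cm() -> float:
    """Simulated proximity reading; a real device would query an
    ultrasonic/IR sensor or a tracking system here."""
    return random.uniform(10.0, 300.0)

def respond_to_animal() -> None:
    """Placeholder response: play media, dispense a treat, start a game."""
    print("Animal approached: starting enrichment content")

def run_animal_activated_device(max_iterations: int = 100) -> None:
    """Idle until the animal approaches; the animal, not the human,
    initiates every interaction."""
    last_trigger = float("-inf")
    for _ in range(max_iterations):
        now = time.monotonic()
        if (read_distance_cm() < ACTIVATION_DISTANCE_CM
                and now - last_trigger > COOLDOWN_S):
            respond_to_animal()
            last_trigger = now
        time.sleep(0.1)

run_animal_activated_device()
```

The cooldown is a deliberate design choice here: one approach yields one response, so the animal's continued presence is not turned into repeated, unsolicited triggering.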

4.2. Investigating What Is Interactivity in Animal-Computer Interaction (ACI)

At the beginning of this literature review, comments were made on the feedback loop between animals and systems. One of the key areas for ACI is to investigate what it means for animals and technology to interact. For instance, depending on the focus of the research, animals have been considered anything from wearers, as in the case of biotelemetry devices, to fully interactive users, as within some playful technologies. However, there is a whole range of interactivity levels, participation models and discussions around whether an animal is actively or inadvertently using a technology, and these have yet to be scoped within ACI. As an example, haptic interfaces such as vibrotactile vests provide an interaction that the animal receives, with the animal initially not knowing what caused the system to be triggered and instead learning the triggering mechanism through training schemes focused on shaping its behaviour. There are also cases, such as Lee et al. [11] or Van Eck and Lamers [84], where the animal receives output produced by a human user without any control over the situation, which arguably might not constitute an interaction from the animal's stance. At the other end of the spectrum from such involuntary participation, the idea of consent within ACI [9] is being introduced, where the animal is free to choose whether or not to interact [31].
Currently, little is known about the reasoning behind an animal's interactions with technology (the gulf of execution) or what the term 'interaction' means in the field of ACI, although this is beginning to be explored and discussed [32]. This could be further investigated by examining animals' usability through their attention towards technology and their behaviour while using systems, and then aligning systems to allow for these actions. Long-term deployments of technology would also help to observe these spontaneous behaviours in a less constrained scenario than prefixed experimental sessions. All these elements could help build up a stronger picture of interactivity in ACI.

4.3. Ethics and Agency in Animal-Computer Interaction (ACI)

Whilst this literature review does not aim to tackle in depth the notions of ethics and agency within animal-computing systems, these attributes are inherently interwoven into the technology interfaces used in ACI. As Mancini [20] advocates in her welfare-centred ethics framework, there is a case in ACI for moving beyond existing regulations and guidelines towards an animal-centric approach. This framework, like Hirskyj-Douglas and Read's ethical framework for dogs [137], focuses on obtaining mediated and contingent consent from the animal. Alternatively, Väätäjä [138] takes another viewpoint on the ethical implications of technology for animals, seeing ethics as a welfare issue and therefore motivating her framework by the need to mediate relatedness and intimacy with technology, drawing from the '3Rs' of science: replacement, reduction and refinement [139].
Intrinsically tied into this discussion are consent and animal agency, that is, the ability of the animal to act independently and to make its own free choices. Whilst Mancini [20] argues that technology blurs the boundary between animal and human agency, as the lines between user and interactor become obscured, Hirskyj-Douglas and Read [53] advocate for animal agency by exploring technology interventions that aim towards animal-driven devices. This stance aims towards free choice (consent) on the part of the animals themselves, giving them further agency within technological interventions, and situates this conversation between the ethical morality of working with animals and the human drive to gather data [137].
Drawing back to technological instances, whilst ACI has the potential to contribute significantly to the ethics of animals' situation within, and use of, technology, the current landscape of computer systems, as explored above, presents varying degrees of ethical implications and agency for animals. For instance, not all the technologies explored within this manuscript allow for consent, nor do they understand consent in the same way. Thus, the future of technological devices for animals is closely tied to ethical and agency propositions, and to methodological conditions for both shaping future technologies and fostering animal-centric approaches. This creates a juncture in ACI to explore giving animals varying degrees of agency within technology to shape the future of ACI systems under a common understanding of welfare implications and ethics.

4.4. Moving Beyond the Human-Animal-Computer Void

Perhaps due to the novelty of the field and the fact that the ACI community still has more questions than answers, most of the works reviewed in this manuscript focus on a single animal interacting with a system. Several works have also addressed the role of technology in human-animal relationships, as well as human-mediated systems for animal use. As suggested in Weilenmann and Juhlin [87], Aspling and Juhlin [27] and in this literature review, there is an opportunity to move ACI research from sole animal-computer interaction and human-animal computer-mediated interaction towards animal-animal interaction by means of technology. Whilst it is unknown at this stage what animals need and want from computer technology for animal-to-animal communication, it is only through exploring these areas that new light can be shed. Animals are often put in situations which are not ideal for their well-being due to human circumstances, i.e., shelters, rescues and being left alone in homes, which isolate them from contact with other animals; here, technology interventions could intervene, aiding both the human carer and the animal themselves.
The animal-to-animal internet has been used at a technological level for internet communication sharing, as a method of maximising technology performance by allowing the systems worn by different pigeons to communicate [140]; this has, however, been irrespective of the animal-to-animal interaction itself. Whilst Reiss et al. [141] have suggested an interspecies internet to allow all animals to communicate online, and Lawson et al. [48] suggest an internet of dogs via olfactory systems, this ideology has yet to be brought to fruition beyond concepts and design fiction connecting animals through computer systems. Perhaps the next step for animal-centricity is to build technology systems that allow an internet of animals to evolve, building ACI solely from animal requirements. This would allow, in some sense, the barriers put up within ACI by humans to be surpassed. This narrative includes the exploration of multispecies computer-mediated interactions, i.e., animal-animal interactive systems.
It is acknowledged, however, that even within animal-animal technology systems that are animal-centred, the human is inherently part of the interaction paradigm, as humans build, initially design and inform part, if not all, of the system(s) and interpret the behaviour as an output. However, more effort could be placed on giving the human and the animal the same responsibilities and actions during the interaction phase, giving more voice and control to the animal to guide the interaction and to define how it wants to interact with a human [2,102]. Whilst it is understood that the human-animal ACI user space has not yet been fully explored, this is not necessarily a prerequisite for animal-animal exploration, and such exploration is sure to benefit diverse multispecies use.

5. Conclusions

This thematic literature review provides an outline of the ACI field with respect to the technologies involved. As demonstrated in this review, one method for analysing these technologies is through their use: tangible and physical, haptic and wearable, olfactory, screen technologies and tracking technologies. This is just one way of exploring the current literature, yet it is necessary to approach the foundational work in the field in order to reflect on the acquired knowledge. We hope this framework will help build new perspectives on ACI technologies and help to identify promising opportunities for further development. Many of the technologies explored above have potential within various areas of ACI, from assisting with human health technologies, giving animals more control, enabling data exchange between animals and humans, and assisting working and service animals, to monitoring animals both in our own environments and in their natural habitats.
There are currently few, but a growing number of, technologies in ACI, often stemming from and adapted from HCI and ethology. This literature review has identified technologies used in ACI to support human-to-animal, animal-to-human and animal-to-robot communication via computerised systems. In particular, this thematic literature review notes spaces within tangible and physical, haptic and wearable, olfactory, screen and tracking technologies. In tangible and physical technologies, a shift was noticed from traditionally investigating cognition and behaviour towards looking more closely at the interaction paradigm, with particular expansion of systems in the zoo and pet fields. Gaps were observed within tangible and physical systems to investigate new training paradigms so that animals can use systems more efficiently, and to look further at how animals respond to the interfaces that they hold and use. For wearable systems, it was noted that ACI is expanding into wearables in order to quantify animals' behaviours and actions, providing information supplementary to tracking systems. Haptic systems are expanding further into providing a communication mechanism between the animal and the human, to investigate the feedback loop more deeply. Another space identified within these two fields relates to pairing such systems together to provide quantified and enhanced communication. In olfactory technologies, only dogs and deer have been investigated so far, and only one system has been implemented, leaving space to investigate this interaction mechanism further for dogs and in other species. In visual screen interfaces, there is a gap in mapping animals' requirements for these interfaces, as mentioned in relation to tracking technologies; the need was also noted for screen technologies to be more user-centric and to vary by species, drawing from the field of HCI. Lastly, in tracking technologies there is a gap, and a move in ACI tracking methods towards unsupervised and untrained off-body instances, particularly regarding their usefulness for the animal user and for quantifying the animal's behaviour. As suggested throughout the various sections of this literature review, there is a need for a further interdisciplinary approach within ACI technologies to ground the field in animal cognition and behaviour, not just within the computer interaction space.
There are, as Jukan et al. [18] mention, technical and economic challenges to overcome, but these have to be framed and mapped together to create more foundational field knowledge from which both new and expert researchers and developers of ACI technologies can build. As a research field, ACI is embedded within a research mentality oriented towards positive animal welfare. The question is instead one of boundaries: how much technology to implement, how to apply these developments, and which interaction paradigms need to be carefully explored.

Acknowledgments

The work of J.J. and P.P. is supported by the European Regional Development Fund (ERDF-FEDER) and the Spanish MINECO through Project TIN2014-60077-R. P.P. also receives support from a national grant from the Spanish MECD (FPU13/03831). The authors would also like to thank Patrizia Paci for her contribution during the beginning stages of this review.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Dixon, A.K. Ethological Strategies for Defence in Animals and Humans: Their Role in Some Psychiatric Disorders. Br. J. Med. Psychol. 1998, 71, 417–445.
2. Pons, P.; Jaen, J.; Catala, A. Envisioning Future Playful Interactive Environments for Animals. In More Playful User Interfaces; Nijholt, A., Ed.; Springer: Singapore, 2015; pp. 121–150.
3. Westerlaken, M.; Gualeni, S. Felino: The Philosophical Practice of Making an Interspecies Videogame. In Proceedings of the Philosophy of Computer Games Conference, Istanbul, Turkey, 13–15 November 2014; pp. 1–12.
4. Resner, B. Rover@Home: Computer Mediated Remote Interaction between Humans and Dogs. Master’s Thesis, Massachusetts Institute of Technology, Cambridge, MA, USA, 2001.
5. Mancini, C.; van der Linden, J. UbiComp for Animal Welfare: Envisioning Smart Environments for Kenneled Dogs. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Seattle, WA, USA, 13–17 September 2014; pp. 117–128.
6. Morrison, A.; Møller, R.H.; Manresa-Yee, C.; Eshraghi, N. The Impact of Training Approaches on Experimental Setup and Design of Wearable Vibrotactiles for Hunting Dogs. In Proceedings of the Third International Conference on Animal-Computer Interaction—ACI ’16, Milton Keynes, UK, 15–17 November 2016; ACM Press: New York, NY, USA, 2016; pp. 1–10.
7. Robinson, C.; Mancini, C.; van der Linden, J.; Guest, C.; Harris, R. Empowering Assistance Dogs: An Alarm Interface for Canine Use. In Proceedings of the Intelligent Systems for Animal Welfare, London, UK, 1–4 April 2014; pp. 1–4.
8. Zeagler, C.; Zuerndorfer, J.; Lau, A.; Freil, L.; Gilliland, S.; Starner, T.; Jackson, M.M. Canine Computer Interaction: Towards Designing a Touchscreen Interface for Working Dogs. In Proceedings of the Third International Conference on Animal-Computer Interaction—ACI ’16, Milton Keynes, UK, 15–17 November 2016; ACM Press: New York, NY, USA, 2016; pp. 1–5.
9. Mancini, C. Animal-Computer Interaction: A Manifesto. Mag. Interact. 2011, 18, 69–73.
10. Mancini, C.; Juhlin, O.; Cheock, A.D.; van der Linden, J.; Lawson, S. Animal-Computer Interaction (ACI): Pushing Boundaries beyond “Human”. In Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational, Helsinki, Finland, 26–30 October 2014; pp. 833–836.
11. Lee, S.P.; Cheok, A.D.; James, T.K.S.; Debra, G.P.L.; Jie, C.W.; Chuang, W.; Farbiz, F. A Mobile Pet Wearable Computer and Mixed Reality System for Human–poultry Interaction through the Internet. Pers. Ubiquitous Comput. 2006, 10, 301–317.
12. Tan, R.T.K.C.; Cheok, A.D.; Peiris, R.L.; Wijesena, I.J.P.; Tan, D.B.S.; Raveendran, K.; Nguyen, K.D.T.; Sen, Y.P.; Yio, E.Z. Computer Game for Small Pets and Humans. In Entertainment Computing—ICEC 2007: 6th International Conference; Springer: Berlin/Heidelberg, Germany, 2007; pp. 28–38.
13. Wirman, H.; Zamansky, A. Toward Characterization of Playful ACI. Interactions 2016, 23, 47–51.
14. FitBark Dog Activity Monitor: Healthy Together. Available online: https://www.fitbark.com/ (accessed on 20 March 2018).
15. CleverPet. CleverPet: Engage Idle Paws. Available online: https://clever.pet/ (accessed on 6 April 2018).
16. Wemelsfelder, F.; Hunter, T.E.A.; Mendl, M.T.; Lawrence, A.B. Assessing the “Whole Animal”: A Free Choice Profiling Approach. Anim. Behav. 2001, 62, 209–220.
17. Mancini, C.; Lawson, S.; van der Linden, J.; Häkkilä, J.; Noz, F.; Juhlin, O. Animal-Computer Interaction SIG. In Proceedings of the 2012 ACM Annual Conference Extended Abstracts on Human Factors in Computing Systems Extended Abstracts, Austin, TX, USA, 5–10 May 2012; pp. 1233–1236.
18. Jukan, A.; Masip-Bruin, X.; Amla, N. Smart Computing and Sensing Technologies for Animal Welfare: A Systematic Review. ACM Comput. Surv. 2017, 50, 1–27.
19. Jones, M.K. Human-Computer Interaction: A Design Guide; Educational Technology: Englewood Cliffs, NJ, USA, 1989.
20. Mancini, C. Towards an Animal-Centred Ethics for Animal-Computer Interaction. Int. J. Hum. Comput. Stud. 2017, 98, 221–233.
21. Mancini, C.; Li, S.; O’Connor, G.; Valencia, J.; Edwards, D.; McCain, H. Towards Multispecies Interaction Environments: Extending Accessibility to Canine Users. In Proceedings of the Third International Conference on Animal-Computer Interaction—ACI ’16, Milton Keynes, UK, 15–17 November 2016; ACM Press: New York, NY, USA, 2016; pp. 1–10.
22. Darwin, C. On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life; John Murray: London, UK, 1859.
23. Hogenboom, M. The Traits That Make Human Beings Unique. BBC. Available online: http://www.bbc.com/future/story/20150706-the-small-list-of-things-that-make-humans-unique (accessed on 20 March 2018).
24. Buchanan, R. Design Research and the New Learning. Des. Issues 2001, 17, 3–23.
25. Dubberly, H.; Pangaro, P.; Haque, U. What Is Interaction? Are There Different Types? Interactions 2009, 16, 69–75.
26. Goode, D. Playing with My Dog Katiee: An Ethnomethodological Study of Dog—Human Interaction; Purdue University Press: West Lafayette, IN, USA, 2006.
27. Aspling, F.; Juhlin, O. Theorizing Animal–computer Interaction as Machinations. Int. J. Hum. Comput. Stud. 2017, 98, 135–149.
28. Goffman, E. Strategic Interaction; University of Pennsylvania Press: Philadelphia, PA, USA, 1971.
29. Norman, D.A. The Design of Everyday Things. Available online: http://www.nixdell.com/classes/HCI-and-Design-Spring-2017/The-Design-of-Everyday-Things-Revised-and-Expanded-Edition.pdf (accessed on 18 April 2018).
30. Geurtsen, A.; Lamers, M.H.; Schaaf, M.J.M. Interactive Digital Gameplay Can Lower Stress Hormone Levels in Home Alone Dogs: A Case for Animal Welfare Informatics. In 14th International Conference on Entertainment Computing (ICEC 2015); Chorianopoulos, K., Divitini, M., Hauge, J.B., Jaccheri, L., Malaka, R., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2015; Volume 9353, pp. 238–251.
31. Hirskyj-Douglas, I.; Read, J.C.; Cassidy, B.C. A Dog Centred Approach to the Analysis of Dogs’ Interactions with Media on TV Screens. Int. J. Hum. Comput. Stud. 2017, 98, 208–220.
32. Freil, L.; Byrne, C.; Valentin, G.; Zeagler, C.; Roberts, D.; Starner, T.; Jackson, M. Canine-Centered Computing. Found. Trends Hum. Comput. Interact. 2017, 10, 87–164.
33. Ritvo, S.E.; Allison, R.S. Challenges Related to Nonhuman Animal-Computer Interaction: Usability and Liking. In Proceedings of the 2014 Workshops on Advances in Computer Entertainment Conference—ACE ’14 Workshops, Funchal, Portugal, 11–14 November 2014; ACM Press: New York, NY, USA, 2014; pp. 1–7.
34. Britt, W.R.; Miller, J.; Waggoner, P.; Bevly, D.M.; Hamilton, J.A. An Embedded System for Real-Time Navigation and Remote Command of a Trained Canine. Pers. Ubiquitous Comput. 2011, 15, 61–74.
35. Valentin, G.; Alcaidinho, J.; Howard, A.; Jackson, M.M.; Starner, T. Towards a Canine-Human Communication System Based on Head Gestures. In Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology—ACE ’15, Iskandar, Malaysia, 16–19 November 2015; ACM Press: New York, NY, USA, 2015; pp. 1–9.
36. Majikes, J.; Brugarolas, R.; Winters, M.; Yuschak, S.; Mealin, S.; Walker, K.; Yang, P.; Sherman, B.; Bozkurt, A.; Roberts, D.L. Balancing Noise Sensitivity, Response Latency, and Posture Accuracy for a Computer-Assisted Canine Posture Training System. Int. J. Hum. Comput. Stud. 2017, 98, 179–195.
37. Mancini, C.; Harris, R.; Aengenheister, B.; Guest, C. Re-Centering Multispecies Practices: A Canine Interface for Cancer Detection Dogs. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems—CHI ’15, Seoul, Korea, 18–23 April 2015; ACM Press: New York, NY, USA, 2015; pp. 2673–2682.
38. Lemasson, G.; Duhaut, D.; Pesty, S. Dog: Can You Feel It? Available online: http://acid.uclan.ac.uk/publications (accessed on 18 April 2018).
39. Trindade, R.; Sousa, M.; Hart, C.; Vieira, N.; Rodrigues, R.; França, J. Purrfect Crime. In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems—CHI EA ’15, Seoul, Korea, 18–23 April 2015; pp. 93–96.
40. Mealin, S.; Winters, M.; Domínguez, I.X.; Marrero-García, M.; Bozkurt, A.; Sherman, B.L.; Roberts, D.L. Towards the Non-Visual Monitoring of Canine Physiology in Real-Time by Blind Handlers. In Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology—ACE ’15, Iskandar, Malaysia, 16–19 November 2015; ACM Press: New York, NY, USA, 2015; pp. 1–8.
41. Johnston-Wilder, O.; Mancini, C.; Aengenheister, B.; Mills, J.; Harris, R.; Guest, C. Sensing the Shape of Canine Responses to Cancer. In Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology—ACE ’15, Iskandar, Malaysia, 16–19 November 2015; ACM Press: New York, NY, USA, 2015; pp. 63:1–63:4.
42. Perdue, B.M.; Clay, A.W.; Gaalema, D.E.; Maple, T.L.; Stoinski, T.S. Technology at the Zoo: The Influence of a Touchscreen Computer on Orangutans and Zoo Visitors. Zoo Biol. 2012, 31, 27–39.
43. Robinson, C.; Mancini, C.; van der Linden, J.; Guest, C.; Harris, R. Canine-Centered Interface Design: Supporting the Work of Diabetes Alert Dogs. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Toronto, ON, Canada, 26 April–1 May 2014; pp. 3757–3766.
44. Baskin, S.; Zamansky, A. The Player Is Chewing the Tablet! In Proceedings of the 2015 Annual Symposium on Computer-Human Interaction in Play—CHI PLAY ’15, London, UK, 5–7 October 2015; ACM Press: New York, NY, USA, 2015; pp. 463–468.
45. French, F.; Mancini, C.; Sharp, H. Designing Interactive Toys for Elephants. In Proceedings of the 2015 Annual Symposium on Computer-Human Interaction in Play—CHI PLAY ’15, London, UK, 5–7 October 2015; ACM Press: New York, NY, USA, 2015; pp. 523–528.
46. Carter, M.; Webber, S.; Sherwen, S. Naturalism and ACI: Augmenting Zoo Enclosures with Digital Technology. In Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology—ACE ’15, Iskandar, Malaysia, 16–19 November 2015; ACM Press: New York, NY, USA, 2015; pp. 61:1–61:5.
47. Mancini, C. Animal-Computer Interaction (ACI): Changing Perspective on HCI, Participation and Sustainability. In Proceedings of the CHI ’13 Extended Abstracts on Human Factors in Computing Systems, Paris, France, 27 April–2 May 2013; ACM Press: New York, NY, USA, 2013; pp. 2227–2236.
48. Lawson, S.; Kirman, B.; Linehan, C. Power, Participation, and the Dog Internet. Interactions 2016, 23, 37–41.
49. Paci, P.; Mancini, C.; Price, B.A. Towards a Wearer-Centred Framework for Animal Biotelemetry Designing for Wearability. In Proceedings of the Measuring Behavior 2016, Dublin, Ireland, 25–27 May 2016; pp. 465–469.
50. Paci, P.; Mancini, C.; Price, B.A. Designing for Wearability in Animal Biotelemetry. In Proceedings of the Third International Conference on Animal-Computer Interaction—ACI ’16, Milton Keynes, UK, 15–17 November 2016; ACM Press: New York, NY, USA, 2016; pp. 1–4.
51. Pons, P.; Jaen, J.; Catala, A. Developing a Depth-Based Tracking System for Interactive Playful Environments with Animals. In Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology—ACE ’15, Iskandar, Malaysia, 16–19 November 2015; ACM Press: New York, NY, USA, 2015; pp. 1–8.
52. Winters, M.; Brugarolas, R.; Majikes, J.; Mealin, S.; Yuschak, S.; Sherman, B.L.; Bozkurt, A.; Roberts, D. Knowledge Engineering for Unsupervised Canine Posture Detection from IMU Data. In Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology—ACE ’15, Iskandar, Malaysia, 16–19 November 2015; ACM Press: New York, NY, USA, 2015; pp. 1–8.
53. Hirskyj-Douglas, I.; Read, J.C. DoggyVision: Examining how Dogs (Canis Lupus Familiaris) Interact with Media Using a Dog Driven Proximity Tracker Device. Anim. Behav. Cognit. 2018. (Submitted).
54. Byrne, C.; Zuerndorfer, J.; Freil, L.; Han, X.; Sirolly, A.; Cilliland, S.; Starner, T.; Jackson, M. Predicting the Suitability of Service Animals Using Instrumented Dog Toys. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2018, 1, 127.
55. French, F.; Mancini, C.; Sharp, H. High Tech Cognitive and Acoustic Enrichment for Captive Elephants. J. Neurosci. Methods 2018, 300, 173–183.
56. Gergely, A.; Petró, E.; Topál, J.; Miklósi, A. The Emergence of Social Interaction between Dog and an Unidentified Moving Object (UMO). In Proceedings of the 50th Annual Convention of the Society for the Study of Artificial Intelligence and the Simulation of Behaviour, London, UK, 1–4 April 2014.
57. Gergely, A.; Compton, A.B.; Newberry, R.C.; Miklósi, Á. Social Interaction with An “unidentified Moving Object” Elicits A-Not-B Error in Domestic Dogs. PLoS ONE 2016, 11, e0151600.
58. Gupfinger, R.; Kaltenbrunner, M. Sonic Experiments with Grey Parrots: A Report on Testing the Auditory Skills and Musical Preferences of Grey Parrots in Captivity. In Proceedings of the Fourth International Conference on Animal-Computer Interaction (ACI 2017), Milton Keynes, UK, 21–23 November 2017; pp. 3:1–3:6.
59. Hermans, N.F.H.; Eggen, J.H. Beyond Barriers: Exploring Opportunities of Digital Technology to Encourage Personal Interaction between Captive Orangutans and Zoo Visitors. In Proceedings of the HCI Goes to the Zoo, CHI 2016 Workshops, San Jose, CA, USA, 7–12 May 2016; pp. 1–7.
60. Hu, F.; Silver, D.; Trude, A. LonelyDog@Home. In 2007 IEEE/WIC/ACM International Conferences on Web Intelligence and Intelligent Agent Technology—Workshops; IEEE: Washington, DC, USA, 2007; pp. 333–337.
61. Krebs, B.L.; Watters, J.V. Using Technology Driven Environments to Promote Animal Well-Being in Zoos. In Proceedings of the HCI Goes to the Zoo, CHI 2016 Workshops, San Jose, CA, USA, 7–12 May 2016; pp. 1–5.
62. Pons, P.; Carter, M.; Jaen, J. Sound to Your Objects: A Novel Design Approach to Evaluate Orangutans’ Interest in Sound-Based Stimuli. In Proceedings of the Third International Conference on Animal-Computer Interaction—ACI ’16, Milton Keynes, UK, 15–17 November 2016; ACM Press: New York, NY, USA, 2016; pp. 1–5.
63. Pons, P.; Jaen, J.; Catala, A. Towards Future Interactive Intelligent Systems for Animals: Study and Recognition of Embodied Interactions. In Proceedings of the 22nd International Conference on Intelligent User Interfaces—IUI ’17, Limassol, Cyprus, 13–16 May 2017; ACM Press: New York, NY, USA, 2017; pp. 389–400.
64. Robinson, C.; Mancini, C.; van der Linden, J.; Guest, C.; Swanson, L.; Marsden, H.; Valencia, J.; Aengenheister, B. Designing an Emergency Communication System for Human and Assistance Dog Partnerships. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing—UbiComp ’15, Osaka, Japan, 7–11 September 2015; ACM Press: New York, NY, USA, 2015; pp. 337–347.
65. Westerlaken, M.; Gualeni, S. Becoming with: Towards the Inclusion of Animals as Participants in Design Processes. In Proceedings of the Third International Conference on Animal-Computer Interaction—ACI ’16, Milton Keynes, UK, 15–17 November 2016; ACM Press: New York, NY, USA, 2016; pp. 1–10.
66. Bozkurt, A.; Roberts, D.L.; Sherman, B.L.; Brugarolas, R.; Mealin, S.; Majikes, J.; Yang, P.; Loftin, R. Toward Cyber-Enhanced Working Dogs for Search and Rescue. IEEE Intell. Syst. 2014, 29, 32–39.
67. Brugarolas, R.; Loftin, R.T.; Yang, P.; Roberts, D.L.; Sherman, B.L.; Bozkurt, A. Behavior Recognition Based on Machine Learning Algorithms for a Wireless Canine Machine Interface. In Proceedings of the 2013 IEEE International Conference on Body Sensor Networks, Cambridge, MA, USA, 6–9 May 2013; pp. 1–5.
68. Byrne, C.; Freil, L.; Starner, T.; Jackson, M.M. A Method to Evaluate Haptic Interfaces for Working Dogs. Int. J. Hum. Comput. Stud. 2017, 98, 196–207.
69. Carpio, F.; Jukan, A.; Sanchez, A.I.M.; Amla, N.; Kemper, N. Beyond Production Indicators: A Novel Smart Farming Application and System for Animal Welfare. In Proceedings of the Fourth International Conference on Animal-Computer Interaction—ACI2017, Milton Keynes, UK, 21–23 November 2017; ACM Press: New York, NY, USA, 2017; pp. 7:1–7:11.
70. Haladjian, J.; Ermis, A.; Hodaie, Z.; Brügge, B. iPig: Towards Tracking the Behavior of Free-Roaming Pigs. In Proceedings of the Fourth International Conference on Animal-Computer Interaction—ACI2017, Milton Keynes, UK, 21–23 November 2017; ACM Press: New York, NY, USA, 2017; pp. 10:1–10:5.
71. Haladjian, J.; Hodaie, Z.; Nüske, S.; Brügge, B. Gait Anomaly Detection in Dairy Cattle. In Proceedings of the Fourth International Conference on Animal-Computer Interaction (ACI 2017), Milton Keynes, UK, 21–23 November 2017; pp. 8:1–8:8.
72. Jackson, M.M.; Kshirsagar, Y.; Starner, T.; Zeagler, C.; Valentin, G.; Martin, A.; Martin, V.; Delawalla, A.; Blount, W.; Eiring, S.; et al. FIDO—Facilitating Interactions for Dogs with Occupations: Wearable Dog-Activated Interfaces. In Proceedings of the 17th Annual International Symposium on Wearable Computers—ISWC ’13, Zurich, Switzerland, 8–12 September 2013; ACM Press: New York, NY, USA, 2013; pp. 81–88.
73. Jackson, M.M.; Valentin, G.; Freil, L.; Burkeen, L.; Zeagler, C.; Gilliland, S.; Currier, B.; Starner, T. FIDO—Facilitating Interactions for Dogs with Occupations: Wearable Communication Interfaces for Working Dogs. Pers. Ubiquitous Comput. 2015, 19, 155–173.
74. Ladha, C.; Hammerla, N.; Hughes, E.; Olivier, P.; Ploetz, T. Dog’s Life: Wearable Activity Recognition for Dogs. In Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing—UbiComp ’13, Zurich, Switzerland, 8–12 September 2013; ACM Press: New York, NY, USA, 2013; pp. 415–418.
75. Mancini, C.; van der Linden, J.; Bryan, J.; Stuart, A. Exploring Interspecies Sensemaking: Dog Tracking Semiotics and Multispecies Ethnography. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing—UbiComp ’12, Pittsburgh, PA, USA, 5–8 September 2012; ACM Press: New York, NY, USA, 2012; pp. 143–152.
76. Mealin, S.; Foster, M.; Walker, K.; Yushak, S.; Sherman, B.; Bozkurt, A.; Roberts, D.L. Creating an Evaluation System for Future Guide Dogs: A Case Study of Designing for Both Human and Canine Needs. In Proceedings of the Fourth International Conference on Animal-Computer Interaction—ACI2017, Milton Keynes, UK, 21–23 November 2017; ACM Press: New York, NY, USA, 2017; pp. 1–6.
77. Paci, P.; Mancini, C.; Price, B.A. The Role of Ethological Observation for Measuring Animal Reactions to Biotelemetry Devices. In Proceedings of the Fourth International Conference on Animal-Computer Interaction—ACI2017, Milton Keynes, UK, 21–23 November 2017; ACM Press: New York, NY, USA, 2017; pp. 1–12.
78. Paldanius, M.; Kärkkäinen, T.; Väänänen-Vainio-Mattila, K.; Juhlin, O.; Häkkilä, J. Communication Technology for Human-Dog Interaction: Exploration of Dog Owners’ Experiences and Expectations. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; ACM Press: New York, NY, USA, 2011; pp. 2641–2650.
79. Pawtrack. GPS Cat Tracking. Available online: https://pawtrack.com/ (accessed on 6 April 2018).
80. Ur Réhman, S.; Li, H. Using Vibrotactile Language for Multimodal Human Animals Communication and Interaction. In Proceedings of the 2014 Workshops on Advances in Computer Entertainment Conference—ACE ’14 Workshops, Funchal, Portugal, 11–14 November 2014; ACM Press: New York, NY, USA, 2014; pp. 1–5.
81. Ribeiro, C.; Ferworn, A.; Denko, M.; Tran, J. Canine Pose Estimation: A Computing for Public Safety Solution. In Proceedings of the 2009 Canadian Conference on Computer and Robot Vision, Kelowna, BC, Canada, 25–27 May 2009; pp. 37–44.
82. Valentin, G. Gestural Activity Recognition for Canine-Human Communication. In Proceedings of the 2014 ACM International Symposium on Wearable Computers Adjunct Program—SWC ’14 Adjunct, Seattle, WA, USA, 13–17 September 2014; ACM Press: New York, NY, USA, 2014; pp. 145–149.
83. Valentin, G.; Alcaidinho, J.; Howard, A.; Jackson, M.M.; Starner, T. Creating Collar-Sensed Motion Gestures for Dog-Human Communication in Service Applications. In Proceedings of the 2016 ACM International Symposium on Wearable Computers—ISWC ’16, Heidelberg, Germany, 12–16 September 2016; ACM Press: New York, NY, USA, 2016; pp. 100–107.
84. Van Eck, W.; Lamers, M. Animal Controlled Computer Games: Playing Pac-Man against Real Crickets. In International Conference on Entertainment Computing (ICEC) 2006; Lecture Notes in Computer Science (LNCS) 4161; Springer: Berlin/Heidelberg, Germany, 2006; pp. 31–36.
85. Van Eck, W.; Lamers, M.H. Player Expectations of Animal Incorporated Computer Games. In Intelligent Technologies for Interactive Entertainment, INTETAIN 2017; Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering; Springer: Cham, Switzerland, 2018; Volume 215, pp. 1–15.
86. Watanabe, S.; Izawa, M.; Kato, A.; Ropert-Coudert, Y.; Naito, Y. A New Technique for Monitoring the Detailed Behaviour of Terrestrial Animals: A Case Study with the Domestic Cat. Appl. Anim. Behav. Sci. 2005, 94, 117–131.
87. Weilenmann, A.; Juhlin, O. Understanding People and Animals: The Use of a Positioning System in Ordinary Human-Canine Interaction. In Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems—CHI ’11, Vancouver, BC, Canada, 7–12 May 2011; ACM Press: New York, NY, USA, 2011; pp. 2631–2640.
88. Weiss, G.M.; Nathan, A.; Kropp, J.B.; Lockhart, J.W. WagTag: A Dog Collar Accessory for Monitoring Canine Activity Levels. In Proceedings of the 2013 ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication, Zurich, Switzerland, 8–12 September 2013; pp. 405–414.
89. Wingrave, C.A.; Rose, J.; Langston, T.; LaViola, J.J.J. Early Explorations of CAT: Canine Amusement and Training. In Proceedings of the CHI ’10 Extended Abstracts on Human Factors in Computing Systems, Atlanta, GA, USA, 10–15 April 2010; pp. 2661–2669.
90. Yonezawa, K.; Miyaki, T.; Rekimoto, J. Cat@Log: Sensing Device Attachable to Pet Cats for Supporting Human-Pet Interaction. In Proceedings of the International Conference on Advances in Computer Entertainment Technology—ACE ’09, Athens, Greece, 29–31 October 2009; ACM Press: New York, NY, USA, 2009; pp. 149–156.
91. Zeagler, C.; Byrne, C.; Valentin, G.; Freil, L.; Kidder, E.; Crouch, J.; Starner, T.; Jackson, M.M. Search and Rescue: Dog and Handler Collaboration through Wearable and Mobile Interfaces. In Proceedings of the Third International Conference on Animal-Computer Interaction—ACI ’16, Milton Keynes, UK, 15–17 November 2016; ACM Press: New York, NY, USA, 2016; pp. 1–9.
92. Whistle GPS Tracker for Dogs. Available online: https://www.whistle.com (accessed on 20 March 2018).
93. Kobayashi, H.; Muramatsu, K.; Okuno, J.; Nakamura, K.; Fujiwara, A.; Saito, K. Playful Rocksalt System: Animal-Computer Interaction Design in Wild Environments. In Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology—ACE ’15, Iskandar, Malaysia, 16–19 November 2015; ACM Press: New York, NY, USA, 2015; pp. 1–4.
94. Alfrink, K.; van Peer, I.; Lagerweij, H.; Driessen, C.; Bracke, M.; Copier, M. Pig Chase. Playing with Pigs Project. Available online: www.playingwithpigs.nl (accessed on 6 April 2018).
95. Apps for Apes. Available online: https://redapes.org/multimedia/apps-for-apes/ (accessed on 6 April 2018).
96. Kerepesi, A.; Dóka, A.; Miklósi, Á. Dogs and Their Human Companions: The Effect of Familiarity on Dog-human Interactions. Behav. Process. 2015, 110, 27–36.
97. Martin, C.F.; Shumaker, R.W. Great Ape Touch-Panel Tasks: A Platform for Research, Enrichment, and Conservation. In Proceedings of the HCI Goes to the Zoo, CHI 2016 Workshops, San Jose, CA, USA, 7–12 May 2016; pp. 1–5.
98. Ohta, N.; Nishino, H.; Takashima, A.; Cheok, A.D. Animal-Human Digital Interface: Can Animals Collaborate with Artificial Presences? In Proceedings of the Measuring Behavior 2016, Dublin, Ireland, 25–27 May 2016; pp. 455–458.
99. Rossi, A.P.; Rodriguez, S.; Cardoso dos Santos, C.R. A Dog Using Skype. In Proceedings of the Third International Conference on Animal-Computer Interaction—ACI ’16, Milton Keynes, UK, 15–17 November 2016; ACM Press: New York, NY, USA, 2016; pp. 1–4.
100. Wallis, L.J.; Range, F.; Kubinyi, E.; Chapagain, D.; Serra, J.; Huber, L. Utilising Dog-Computer Interactions to Provide Mental Stimulation in Dogs Especially during Ageing. In Proceedings of the Fourth International Conference on Animal-Computer Interaction (ACI 2017), Milton Keynes, UK, 21–23 November 2017; pp. 1–12.
101. Webber, S.; Carter, M.; Smith, W.; Vetere, F. Interactive Technology and Human-Animal Encounters at the Zoo. Int. J. Hum. Comput. Stud. 2017, 98, 150–168.
102. Wirman, H. Games For/with Strangers—Captive Orangutan (Pongo Pygmaeus) Touch Screen Play. Antennae 2014, 30, 105–115.
103. Wirman, H.E.; Jørgensen, I.K.H. Designing for Intuitive Use for Non-Human Users. In Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology—ACE ’15, Iskandar, Malaysia, 16–19 November 2015; ACM Press: New York, NY, USA, 2015; pp. 1–8.
104. Zeagler, C.; Gilliland, S.; Freil, L.; Starner, T.; Jackson, M. Going to the Dogs: Towards an Interactive Touchscreen Interface for Working Dogs. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology—UIST ’14, Honolulu, HI, USA, 5–8 October 2014; ACM Press: New York, NY, USA, 2014; pp. 497–507.
105. Dong, R.; Carter, M.; Smith, W.; Joukhadar, Z.; Sherwen, S.; Smith, A. Supporting Animal Welfare with Automatic Tracking of Giraffes with Thermal Cameras. In Proceedings of the 29th Australian Conference on Computer-Human Interaction—OZCHI ’17, Brisbane, Australia, 28 November–1 December 2017; pp. 386–391.
106. Grant, R.A.; Hewitt, B. Use of Image Processing Techniques to Quantify Sensory and Motor Behaviors in Rodents by Measuring Whisker Movements. In Proceedings of the Measuring Behavior 2016, Dublin, Ireland, 25–27 May 2016; pp. 20–22.
107. Hirskyj-Douglas, I.; Luo, H.; Read, J.C. Is My Dog Watching TV? In Proceedings of the NordiCHI’14—Workshop on Animal-Computer Interaction (ACI): Pushing Boundaries beyond “Human”, Helsinki, Finland, 26–30 October 2014.
108. North, S. Salient Features, Combined Detectors and Image Flipping: An Approach to Haar Cascades for Recognising Horses and Other Complex, Deformable Objects. In Proceedings of the Fourth International Conference on Animal-Computer Interaction—ACI2017, Milton Keynes, UK, 21–23 November 2017; ACM Press: New York, NY, USA, 2017; pp. 1–6.
109. North, S.; Hall, C.; Roshier, A.; Mancini, C. HABIT: Horse Automated Behaviour Identification Tool—A Position Paper. In Proceedings of the British Human Computer Interaction Conference—Animal Computer Interaction Workshop, Lincoln, UK, 13–17 July 2015; pp. 1–4.
110. Williams, F.J.; Mills, D.S.; Guo, K. Development of a Head-Mounted, Eye-Tracking System for Dogs. J. Neurosci. Methods 2011, 194, 259–265.
111. Webber, S.; Carter, M.; Sherwen, S.; Smith, W.; Joukhadar, Z.; Vetere, F. Kinecting with Orangutans: Zoo Visitors’ Empathetic Responses to Animals’ Use of Interactive Technology. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems—CHI ’17, Denver, CO, USA, 6–11 May 2017; ACM Press: New York, NY, USA, 2017; pp. 6075–6088.
112. Somppi, S.; Törnqvist, H.; Hänninen, L.; Krause, C.; Vainio, O. Dogs Do Look at Images: Eye Tracking in Canine Cognition Research. Anim. Cogn. 2012, 15, 163–174.
113. Amir, S.; Zamansky, A.; van der Linden, D. K9-Blyzer—Towards Video-Based Automatic Analysis of Canine Behavior. In Proceedings of the Fourth International Conference on Animal-Computer Interaction—ACI2017, Milton Keynes, UK, 21–23 November 2017; ACM Press: New York, NY, USA, 2017; pp. 1–5.
114. Mealin, S.; Domínguez, I.X.; Roberts, D.L. Semi-Supervised Classification of Static Canine Postures Using the Microsoft Kinect. In Proceedings of the Third International Conference on Animal-Computer Interaction—ACI ’16, Milton Keynes, UK, 15–17 November 2016; ACM Press: New York, NY, USA, 2016; pp. 1–4.
115. Oczak, M.; Maschat, K.; Berckmans, D.; Vranken, E.; Baumgartner, J. Automatic Estimation of Number of Piglets in a Pen during Farrowing, Using Image Analysis. Biosyst. Eng. 2016, 151, 81–89.
116. Rumbaugh, D.M. Language Learning by a Chimpanzee: The Lana Project; Academic Press: New York, NY, USA, 1977.
117. McCowan, D.R.B. Spontaneous Vocal Mimicry and Production by Bottlenose Dolphins (Tursiops Truncatus): Evidence for Vocal Learning. J. Comp. Psychol. 1993, 107, 301–312.
118. French, F.; Mancini, C.; Sharp, H. Exploring Methods for Interaction Design with Animals: A Case-Study with Valli. In Proceedings of the Third International Conference on Animal-Computer Interaction—ACI ’16, Milton Keynes, UK, 15–17 November 2016; ACM Press: New York, NY, USA, 2016; pp. 3:1–3:5.
119. Singh, A.; Young, J.E. A Dog Tail for Utility Robots: Exploring Affective Properties of Tail Movement. In IFIP Conference on Human-Computer Interaction; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2013; Volume 8118, pp. 403–419.
120. Haraway, D. When Species Meet; University of Minnesota Press: Minneapolis, MN, USA, 2007.
121. Hirskyj-Douglas, I.; Read, J.C. Who Is Really In The Center Of Dog Computer Design? In Proceedings of the 2014 Workshops on Advances in Computer Entertainment Conference—ACE ’14 Workshops, Funchal, Portugal, 11–14 November 2014; ACM Press: New York, NY, USA, 2014; pp. 2:1–2:5.
122. Sreelakshmi, M.; Subash, T.D. Haptic Technology: A Comprehensive Review on Its Applications and Future Prospects. Mater. Today Proc. 2017, 4, 4182–4187.
123. Mann, S. The Encyclopedia of Human-Computer Interaction, 2nd Ed. Wearable Computing. Available online: https://www.interaction-design.org/literature/book/the-encyclopedia-of-human-computer-interaction-2nd-ed/wearable-computing (accessed on 20 May 2018).
124. Hirskyj-Douglas, I.; Read, J.C. Using Behavioural Information to Help Owners Gather Requirements from Their Dogs’ Responses to Media Technology. In Proceedings of the Human Computer Interaction British 2016, Poole, UK, 11–15 July 2016; pp. 1–13.
125. Campion, M. History of Eye Tracking Studies and Technology. Look Tracker. Available online: https://www.looktracker.com/blog/eye-tracking-technology/the-history-of-eye-tracking-studies-and-technology/ (accessed on 13 August 2017).
126. Nahm, F.K.D.; Perret, A.; Amaral, D.G.; Albright, T.D. How Do Monkeys Look at Faces? J. Cogn. Neurosci. 1997, 9, 611–623.
127. Leach, M.C.; Klaus, K.; Miller, A.L.; Scotto di Perrotolo, M.; Sotocinal, S.G.; Flecknell, P.A. The Assessment of Post-Vasectomy Pain in Mice Using Behaviour and the Mouse Grimace Scale. PLoS ONE 2012, 7, e35656.
128. Jacob, R.J.K.; Karn, K.S. Eye Tracking in Human-Computer Interaction and Usability Research: Ready to Deliver the Promises. In The Mind’s Eye; Elsevier: Amsterdam, The Netherlands, 2003; pp. 573–605.
129. Poole, A.; Ball, L.J. Eye Tracking in HCI and Usability Research. In Encyclopedia of Human Computer Interaction; IGI Global: Hershey, PA, USA, 2005; pp. 211–219.
130. Pons, P.; Jaen, J.; Catala, A. Assessing Machine Learning Classifiers for the Detection of Animals’ Behavior Using Depth-Based Tracking. Expert Syst. Appl. 2017, 86, 235–246.
131. Faragó, T.; Miklósi, Á.; Korcsok, B.; Száraz, J.; Gácsi, M. Social Behaviours in Dog-Owner Interactions Can Serve as a Model for Designing Social Robots. Interact. Stud. 2014, 15, 143–172.
132. Racca, A.; Amadei, E.; Ligout, S.; Guo, K.; Meints, K.; Mills, D. Discrimination of Human and Dog Faces and Inversion Responses in Domestic Dogs (Canis Familiaris). Anim. Cogn. 2010, 13, 525–533.
133. Guo, K.; Meints, K.; Hall, C.; Hall, S.; Mills, D. Left Gaze Bias in Humans, Rhesus Monkeys and Domestic Dogs. Anim. Cogn. 2009, 12, 409–418.
134. Hirskyj-Douglas, I.; Read, J.C.; Juhlin, O.; Väätäjä, H.; Pons, P.; Hvasshovd, S.-O. Where HCI Meets ACI. In Proceedings of the 9th Nordic Conference on Human-Computer Interaction—NordiCHI ’16, Gothenburg, Sweden, 23–27 October 2016; ACM Press: New York, NY, USA, 2016; pp. 1–3.
135. PupPod Smart Dog Toys. Available online: https://puppod.com/ (accessed on 27 March 2018).
136. Nannoni, E.; Martelli, G.; Sardi, L. Enrichments for Pigs: Improving Animal-Environment Relations. In Proceedings of the Fourth International Conference on Animal-Computer Interaction—ACI2017, Milton Keynes, UK, 21–23 November 2017; ACM Press: New York, NY, USA, 2017; pp. 1–6.
137. Hirskyj-Douglas, I.; Read, J.C. The ethics of how to work with dogs in animal computer interaction. In Proceedings of the Animal Computer Interaction Symposium, Measuring Behaviour 2016, Dublin, Ireland, 25–27 May 2016.
138. Väätäjä, H. Animal Welfare as a Design Goal in Technology Mediated Human-Animal Interaction. In Proceedings of the 2014 Workshops on Advances in Computer Entertainment Conference—ACE ’14 Workshops, Funchal, Portugal, 11–14 November 2014; pp. 1–8.
139. Väätäjä, H.; Pesonen, E. Ethical issues and guidelines when conducting HCI studies with animals. In Proceedings of the CHI ’13 Extended Abstracts on Human Factors in Computing Systems, Paris, France, 27 April–2 May 2013; pp. 2159–2168.
140. Kobayashi, H.; Kudo, H. Carrier Pigeon-like Sensing System: Beyond Human-Red Forest Interactions. In Proceedings of the Balance-Unbalance International Conference, Noosa, Australia, 31 May–2 June 2013; pp. 1–15.
141. Reiss, D.; Gabrie, P.; Gershenfeld, N.; Cerf, V. The Interspecies Internet? An Idea in Progress. TedX Talk. Available online: https://www.ted.com/talks/the_interspecies_internet_an_idea_in_progress (accessed on 31 March 2018).
Figure 1. Representation of the gulf of execution in ACI systems.
Figure 2. Framework for technologies in ACI (building from Jukan et al. [18]).
Figure 3. Button system used as a pressure plate to dispense treats [30].
Figure 4. A dog activating a switch [21]. Photo courtesy of The Open University.
Figure 5. Posture system used by Majikes et al. [36].
Figure 6. Olfaction cancer detection system [41]. Photo courtesy of The Open University.
Figure 7. Dog training to click on points on a touchscreen interface [8].
Figure 8. Apps for Apes: An orangutan using a touchscreen [101].
Figure 9. Human and orangutan playing together with a touchscreen interface [97].
Figure 10. Using a head-mounted, eye-tracking system with dogs [110].
Figure 11. Tracking cats using depth measurement via an Xbox Kinect to detect posture [51].
Figure 12. Tracking dogs using posture recognition via an Xbox Kinect [114].
Table 1. Classification of Interactive Technologies in Animal Computer Interaction.

| Classification | Papers | Interface | Species | Aim |
| --- | --- | --- | --- | --- |
| Tangible & Physical | [4,21,30,33,43,54,55,56,57,58,59,60,61,62,63,64,65] | Animal-Robotic Interfaces; Ball Toys; Button Systems; Music Touch Systems; Tug Toys; Treat Systems | Chickens; Dogs; Elephants; Orangutans | Control; Communication; Enrichment; Human Health; Playful; Welfare |
| Haptic & Wearable | [6,11,14,35,36,38,40,50,52,55,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92] | Biotelemetry Collars; GPS Collars; Inertial Measurement Units (IMU); Vibrotactile Vest; Vibrotactile Button; Vibrotactile Plates; Wearable | Chickens; Cows; Crickets; Dogs; Elephants; Pigs | Control; Communication; Service Animal; Working Animal |
| Olfactory | [37,41,48,93] | Deer Cracker (Food); Pressure Sensors; Smell Interface | Deer; Dogs | Control; Communication; Working Animal |
| Screen Technology | [3,8,31,33,42,44,53,94,95,96,97,98,99,100,101,102,103,104] | Interactive Walls; Tablets; Touchscreens; TV Screens | Cats; Dogs; Orangutans; Pigs | Enrichment; Human Health; Playful; Service Animal; Working Animal |
| Tracking Technology | [11,51,52,53,105,106,107,108,109,110,111,112,113,114,115] | Depth Sensors; RGB Cameras; Thermal Cameras | Cats; Chickens; Dogs; Giraffes; Horses; Mice; Orangutans; Pigs | Enrichment; Monitoring; Playful |