DOI: 10.1145/3613904.3642649
Co-designing the Collaborative Digital Musical Instruments for Group Music Therapy

Published: 11 May 2024

Abstract

Digital Musical Instruments (DMIs) have been integrated into group music therapy, providing therapists with alternative ways to engage in musical dialogues with their clients. However, existing DMIs used in group settings are primarily designed for individual use and often overlook the social dynamics inherent in group therapy. Recognizing the crucial role of social interaction in the effectiveness of group therapy, we argue that Collaborative Digital Musical Instruments (CDMIs), seamlessly integrating social interaction with musical expression, hold significant potential to enhance group music therapy. To better tailor CDMIs for group music therapy, we engaged in a co-design process with music therapists, designing and practicing group therapy sessions involving the prototype ComString. In the end, we reflected on the co-design case to suggest future directions for designing CDMIs in group music therapy.
Figure 1: (a) The prototype of a CDMI: "ComString". (b) ComString applied in a stress-reducing music therapy group. (c) Three people play the ComString collaboratively.

1 Introduction

Group music therapy is a form of music therapy conducted in a setting with multiple clients, combining techniques from group psychotherapy to achieve personalized therapeutic goals. Typically, a professionally certified music therapist is required to conduct music therapy sessions[3], guiding clients toward therapeutic goals by engaging them in various musical activities, such as improvisation with instruments, singing, and listening[8, 14, 22, 43, 60]. Experiences emerging from those activities can have therapeutic effects on clients, improving their cognitive capability[8, 20, 41], emotional regulation[1, 44, 75], and motor coordination[49, 71]. Moreover, in group settings, where multiple clients engage collaboratively in musical activities, they can acquire emotional and social support[37], including bonding, belonging, and connectedness[5, 38, 65]. These positive social emotions have also been shown to contribute to increased emotional well-being and stress reduction[13, 33], ultimately benefiting individual health[37].
With the advancement of digital technology in the field of health, researchers have begun to explore its integration with music therapy, aiming to improve quality of life and well-being. One approach is to develop novel interactive music interfaces for therapeutic use. As Agres et al.[2] suggest, this approach can assist therapists in conducting their therapy sessions and can support clients in non-clinical settings between therapy sessions. Some music therapists have already incorporated commercial electronic instruments[10, 68] and novel interactive music interfaces[29, 52, 61, 63] into their therapy sessions, aiming to empower clients through a broader range of musical activities[50]. These interfaces that enable musical expression through digital technology can collectively be referred to as Digital Musical Instruments (DMIs). According to the definition from Elena et al.[50], their form and mode of interaction may either follow a traditional instrument or take the shape of a novel musical device that diverges from traditional instrument forms, such as soft toys[61], elastic screens[9], iPads[54], and human bodies[52, 63].
Currently, within our scope of knowledge, DMIs designed for music therapy mostly focus on innovating the individual interaction between the user and the instrument, paying little attention to enhancing the social interaction formed among users through instruments. This overlooks the design opportunity inherent in the multi-user nature of group therapy settings. In these settings, the interaction between individuals and their instruments is situated within a social context, where an instrument serves not only as a tool for musical performance but also as a medium for communication with others[11]. This prompts us to consider DMIs as collaborative devices within a public social space[40]. In the future design of DMIs for group music therapy, it is crucial not only to contemplate how individuals interact with the instruments but also to consider how the instruments mediate social interactions, enabling individuals to communicate with others through DMIs.
To address this, we note that a class of DMIs designed for multiple users[4, 16, 27, 31, 57, 70] may bring new experiences and alternative tools to therapists and clients in a group. Being designed for multiple users, these DMIs generate sound through interpersonal interaction rather than individual actions, such as touching others' bodies[27] or blowing a single brass tube together through separate mouthpieces[16], which connects people not only through the music but also through their actions. Considering that the music emerges from a collaborative activity through these DMIs, we refer to them as Collaborative Digital Musical Instruments (CDMIs) in this paper.
It is evident that CDMIs, as a category of DMIs designed for multi-user interaction, have the potential to facilitate collaboration among group members, thereby promoting the effectiveness of group music therapy. This integration serves as one approach to combining the fields of HCI and music therapy. However, the efficacy of CDMIs within therapy groups depends on 1) how they are designed by designers and 2) how they are utilized by therapists. Currently, we have not identified relevant work applying CDMIs in group music therapy, leaving both 1) and 2) unclear. To explore these open questions and design CDMIs beneficial for group music therapy, we believe a robust approach is to combine the creative abilities of designers with the professional knowledge of therapists, accumulating a greater number of case studies through collaborative creation. Additionally, this process of designing CDMIs allows for the discovery of methods and challenges in the integration of HCI and music therapy.
For these reasons, as a preliminary exploration of applying CDMIs to group music therapy, we initially targeted a stress-reducing group and collaborated with two therapists to design CDMIs for it. This group, open to the general public with or without emotional disorders, is easily accessible. It provides a safe emotional outlet, offering gentle interventions for well-being and serving as a preventive measure against mental illness. After participating in several stress-reducing groups and experiencing the use of traditional musical instruments there, we first designed a CDMI prototype called ComString and used it as a technology probe[30] to elicit therapists' opinions about how CDMIs can support their therapy sessions. At the end of the discussion with the therapists, we finished a plan of therapy sessions involving ComString for the stress-reducing group. Then, based on the plan, we iterated on ComString and recruited six interaction designers with stress-relief needs to experience the therapy sessions. We hoped that the designers would reflect on their experience from a design perspective. Finally, we worked with the therapists to conduct design retrospectives and reflections. This co-design case contributes both the design and the application process of CDMIs:
(1)
A CDMI prototype named ComString.
(2)
A complete therapy plan and practice, integrated with CDMIs, for the stress-reducing group.
(3)
Several reflections for future CDMIs in the context of group music therapy.

2 Related Work

2.1 Group Music Therapy

Music therapy is a subcategory under the umbrella term of Arts Therapy, alongside other modalities such as dance-movement therapy and visual art therapy. These therapeutic forms share a common emphasis on achieving therapeutic effects through artistic activities. The definition and boundaries of Art Therapy remain ambiguous, primarily arising from the tension between two factions, namely, Art Psychotherapy and Art as Therapy[64]. The former emphasizes the application of psychotherapeutic methods in art activities, while the latter focuses more on individual expression and creative experiences in art activities. However, a consensus can be identified: Art Therapy should be led by professional therapists who hold credentialing and licensure[3, 51, 53]. This is due to their professional training, enabling them to guide clients in artistic activities to promote individual health.
Art therapy also takes place in group settings open to everyone. Therapists adjust therapeutic goals and techniques based on the composition of the members. Although real-world situations are more complex, we can distinguish two types of therapy groups based on the members' mental health states. The first is homogeneous groups, where members share similar emotional disorders or common therapeutic goals. These groups are often encountered in outpatient mental health settings and persist over an extended period with the same members. The second type, which we refer to as universal groups, is open to the general public, who do not necessarily have a diagnosed emotional disorder. These one-time groups aim for prevention and experience, and are usually conducted in community settings, healthcare centers, and educational institutions. Members of such groups are often heterogeneous, and therapists may have limited prior knowledge about them, leading to the use of more universally applicable therapeutic techniques. This classification is also applicable to music therapy groups.

2.2 DMIs in Music Therapy

Early DMIs developed for music therapy were intended to provide individuals with motor disabilities the opportunity to play music despite their physical limitations, granting them the same therapeutic rights as the general population. For example, "Soundbeam", dating from the 1990s, uses ultrasonic detection to convert a user's body movements into sound, allowing people to play a keyboard without physical contact[63]. Later, Mark Finch et al. designed music software for cerebral palsy patients, utilizing head movements as input[17]. In addition to addressing physical disabilities, there are also designs specifically tailored for patients with cognitive impairments, such as a system that allows individuals with dementia to create music of various emotional tones simply by sliding their fingers on a screen[54].
In addition to enabling individuals to engage in music activities, DMIs can also be used to train patients' motor and cognitive abilities[34]. Some works combine game-like designs to achieve these rehabilitation goals, placing them close to the category of serious games. For example, BendableSound[9] is an elastic surface that encourages users to practice coordinated movements: it provides a multisensory environment and trains patients' physical movement abilities by placing musical trigger points at different positions on the elastic screen, guiding them to perform predetermined body movements. These kinds of DMIs can be summarized as "Adaptive Digital Musical Instruments" (ADMIs)[19], which represent the mainstream use of digital instruments in the field of therapy[2].
Differing from the function-oriented DMIs above, therapists have also begun exploring the new possibilities digital instruments bring to music therapy on a non-functional level, such as aesthetics, creativity, and communication. These types of instruments are also suitable for clients without cognitive or physical impairments. For example, in the practice of using interactive music dolls as a medium, Karette Stensæth[61] reflects on how digital instruments may change the dynamic between therapists, instruments, and clients, granting the instrument a more subjective agency to engage in dialogue with clients, while therapists take on the role of an intimate friend. However, research on these innovative music mediums in health contexts is relatively limited, and only a few projects, such as the RHYME project[6] and a series of works from Hunt et al.[29], documented and discussed the design process. Other therapists engaging in non-functional therapy with DMIs mostly rely on off-the-shelf devices, such as various drum machines and MIDI keyboards[10, 68].
Currently, DMIs have indeed attracted the interest of therapists, yet they are not widely applied in actual therapeutic practice, due to limited commercialization and the difficulty most therapists face in developing such devices themselves[24]. From the perspective of HCI researchers, developing DMIs for music therapy is an approach to fostering human health and well-being[2]. Following this approach, we have noticed a lack of attention to the multi-user aspects of group music therapy in current therapeutic DMI designs. Therefore, we use CDMIs as a starting point to discuss how to design therapeutic digital instruments specifically for groups.

2.3 Collaborative Digital Musical Instruments

In the rich body of work related to music in HCI, we are particularly interested in research involving collaborative music activities by multiple users[16, 27, 31, 57, 70, 73], as it can provide valuable references for designing CDMIs for therapeutic use. In such works, playing music becomes a collaborative act: individuals are no longer isolated but instead are constantly aware of the presence of others, and may even require mutual coordination and collaboration when using digital technology.
A significant portion of the work focuses on creating new types of musical interfaces to explore greater possibilities for music composition and performance. Some of these interfaces necessarily require collaboration among multiple users to generate music. For instance, in the case of [16], two people collaborate to play a new musical instrument consisting of a single brass tube. By modulating the pressure inside the tube, different sound effects can be achieved, which explores and reflects upon interpersonal relationships from the perspective of an instrument's original form. Other interfaces aim to provide collaborative music creation tools for multiple users. For example, Favilla et al.[15] developed software on a tablet that allows individuals to collaborate on music composition, highlighting that digital technology can enable some specific patients to enjoy the pleasure of collaborative music performance.
Another part of the work focuses on the social aspects brought about by multiple users, where music and musical interfaces serve as a special means to deeply engage in social interaction. For example, Xambó et al.[73] designed and installed a touchable musical interface for multiplayer interaction in public museum spaces, highlighting the social interaction modes between visiting groups from the perspective of behavioral observation.
Some studies have delved deeper into the emotional experience during interactions. For example, Florian Mueller et al. designed a music game that involves two people coordinating movements with a body squeeze pillow, aiming to explore whether the awkwardness brought about by breaking social distance can be alleviated through gameplay[45]. On the other hand, Mads Hobye and Jonas Löwgren transformed physical contact between individuals into music, and found such interaction rules made touch, which might be challenging among strangers, more acceptable. Through this highly intimate musical interaction, a positive sense of connecting and belonging was fostered[27].
Although all of the CDMIs in the studies mentioned above have the potential to be applied in group music therapy, they are more focused on contexts such as public spaces[4, 57, 73], events or shows[27, 31, 46], or education[69]. We have not identified cases where they are truly involved in music therapy groups. In order to explore the effective integration of novel technology in the field of HCI with the systematic knowledge in the field of music therapy, this paper undertook an initial endeavor, providing a case of co-design practice where CDMIs were truly applied in a group therapy setting.

2.4 Technologies for Collocated and Emotional Social Interaction

From a social perspective, CDMIs are interfaces that facilitate collaboration through music. Fostering collaboration within musical groups can garner social and emotional support, thereby promoting therapeutic effects such as stress reduction[13]. Therefore, design strategies aimed at promoting collaboration or emotional support can serve as valuable references for the multi-user interaction design of CDMIs. Olsson et al.[47] conducted a comprehensive review of "collocated interaction", wherein individuals interact in the same physical space and at the same time. They summarized design strategies and social interaction objectives, proposing four roles of technology in this context: enabling, facilitating, inviting, and encouraging social interaction. Moreover, other researchers place less emphasis on collocation and focus more on how interpersonal emotions are mediated or even enhanced through technology. For instance, Stepanova et al.[62] disclosed design strategies that foster a genuine feeling of connection, while Hassenzahl et al.[26] focused on a sense of belonging in intimate relationships. Both studies highlighted that increasing awareness of others contributes to the construction of positive social relationships and experiences. This body of work provided design inspiration for enhancing CDMIs' support for multi-user collaboration.
The above work provides rich theoretical support and case references in terms of music therapy and interactive instrument design, but more design practice is needed to examine how these design strategies can strengthen CDMIs in a group music therapy setting, promoting social interaction and musical collaboration among users and thereby making CDMIs more beneficial to group music therapy. Therefore, in this work, we pose the following progressive research questions to prompt reflection at different stages of the design.
(1)
How can we design CDMIs for group music therapy?
(2)
How do therapists and clients use the CDMIs?
(3)
What do CDMIs bring to therapists, clients, and designers when applied in group music therapy sessions?

3 Research Method and Process

To answer the above questions, we followed the Research through Design (RtD) methodology[18, 21, 76] to explore the integration of CDMIs with group music therapy. While this paper primarily documents the co-design process of a specific CDMI named ComString and its incorporation into a stress-reducing group, we believe that broader knowledge for designing CDMIs can emerge from this practice, including design strategies of CDMIs in a music therapeutic context and co-design experiences with music therapists.

3.1 Design Research Method

RtD is a method that employs design to generate knowledge, aiming to discover "what might be the right thing"[21] by creating and applying artifacts to uncover new understandings of research questions. In the context of this paper, the use of this method implies the necessity to design a CDMI and have therapists use it in a musical group setting.
As a starting point for designing a CDMI, it is essential to target a specific type of therapy group. The type of group significantly influences how therapists utilize musical instruments[72], thereby shaping the design considerations for CDMIs. For example, in groups focused on rehabilitation, instruments are utilized for training motor abilities, while in groups aimed at emotion regulation, instruments are employed for improvisation. The design considerations for these two groups differ significantly.
Once the target group was identified, we established a design team consisting of one designer and one engineer, both of whom have a basic level of musical ability. Then, following the RtD approach, we gained insight into the relationship between therapists, CDMIs, and participants through a co-design practice. As illustrated in Figure 2, the design process was structured into three phases:
Figure 2: The Design Process
Stage 1
Through participatory observation, we developed a technology probe[30] named ComString that enables collaborative music interaction between devices.
Stage 2
Based on ComString, we negotiated with therapists about how to apply it in therapy sessions. After generating a plan of therapy sessions, we iterated on the interaction rules of the prototypes to align with the plan.
Stage 3
Six interaction designers were invited to participate in therapy sessions utilizing ComString. After the sessions, a focus group was conducted to reflect on participants’ experiences. In the process of analyzing the data, individual follow-up interviews were conducted with both the participants and the therapists.
The subsequent sections provide a detailed description of the considerations and objectives within each stage of the research process.

3.2 Design Process

First, we targeted the design context of stress-reducing groups conducted irregularly in schools. Following the background in section 2.1, such a group is a universal group, open to anyone seeking stress relief who can voluntarily sign up for participation. Since this study focuses on exploring the incorporation of collaborative DMIs rather than addressing therapeutic issues specific to particular populations, the generalizability of therapeutic techniques in stress-reducing groups makes them suitable for conducting exploratory research.

3.2.1 Stage 1: Initial Prototyping.

The purpose of a technology probe is to uncover others’ perspectives and experiences with the design objectives in a real-world setting[30]. By initially implementing basic functionalities that convey the design objectives with minimal costs, it provides stakeholders with ample room for imagination. This facilitates discussions with them on the future direction of the artifact.
In this study, we aim to develop a CDMI prototype that combines both social and musical expressive capabilities, with a focus on understanding therapists' perspectives on instruments that can facilitate social interaction. This prototype serves a dual purpose: firstly, it functions as a DMI suitable for use within stress-reducing groups, and secondly, it enables group members to engage in collaborative music activities. Currently, DMIs employed in group music therapy are not suitable to be modified into CDMIs for stress-reducing groups: adaptive DMIs[19] designed for specific populations may not suit the general public in stress-reducing groups, and off-the-shelf devices[68] are difficult to further develop into collaborative instruments. Therefore, we resolved to develop CDMIs for stress-reducing groups from scratch.
Based on participatory observations within stress-reducing groups, we developed an instrument based on a single string that enables collaboration through communication among multiple instruments.

3.2.2 Stage 2: Co-design the Plan.

In this phase, therapists collaborated as co-designers with the shared goal of incorporating CDMIs into their group therapy sessions. Based on ComString, we engaged with two therapists sequentially. Therapist K (tK), who has been working at the university for over five years and has extensive professional experience, provided valuable insights but was unable to develop a complete plan for the group therapy session due to heavy work commitments. Therapist Z (tZ), who received a standardized four-year education in music therapy at the Central Conservatory of Music in China, has one year of post-graduation work experience. Although tZ has less experience than tK, tZ has demonstrated an effective ability to lead therapy groups.
During the interviews with tK, we clarified the design aims of the musical instrument and its potential for social extension. Subsequently, based on tK's feedback, when we worked with tZ, we prepared more detailed interaction proposals in addition to the prototype to help tZ associate the prototype with more applicable group techniques. Following multiple discussions with tZ, a plan of group therapy sessions was collaboratively developed.

3.2.3 Stage 3: Therapy Group Practice.

The final stage of the study comprised a half-day workshop divided into two parts. The first part involved participants experiencing the therapy sessions under the guidance of tZ. The second part was a focus group held after the group sessions, aiming to delve into the experiences and reflections of the participants. The purpose of having participants join the therapy sessions was to gather practical design experience with CDMIs. Considering that personal background influences reflections and descriptions of experiences, we convened participants who were researchers or designers with backgrounds in interaction design or HCI and a need to decompress, to gain more design-related perspectives.
During the first part, to ensure that participants' experiences were close to those of a normal stress-reducing group, the entire set of group sessions aimed to faithfully replicate the common procedure led by music therapists, with minimal intervention from us. As observers, we only intervened when tZ needed us to support her in introducing the prototype. To gain a better understanding of participants' performance and experiences, the Hospital Anxiety and Depression Scale (HADS) was administered prior to the group activities to assess the psychological well-being of the participants.
In the second part, to elicit the tacit experience of participants during therapy sessions, we drew on the soma trajectory[66], having participants plot memorable events along the x-axis and important bodily sensations and emotional changes along the y-axis to create their own experiential trajectories. This approach aimed to articulate subjective experiences that are typically challenging to unearth in a clear, written form. Additionally, during the data analysis phase following the workshop, we revisited certain participants to further explore and discuss phenomena and sensations observed during subsequent reflections that were not explicitly addressed in the initial experiential sharing.

4 Stage 1: Initial Prototyping of ComString

4.1 Design Consideration

In order to demonstrate the collaborative capabilities of CDMIs within a prototype while retaining their fundamental functionality as digital musical instruments and allowing ample room for discussion with therapists regarding the prototype’s developmental direction, we established two design objectives:
O1: Providing Alternative DMI for Stress-reducing Group. When designing digital instruments for stress-reducing groups, our aim was to enhance the group experience by providing participants with a broader array of choices. Digital instruments differentiate themselves from traditional instruments in both interaction modalities and sonic expressions. Sound can be freely chosen, while the interaction modality is determined by the instrument's form. To understand which instruments might enable participants to explore a wider range of interaction modalities, we conducted field research by participating in stress-reducing groups organized by local institutions. We discovered that therapists often offer diverse instruments, but the complexity of these instruments can limit participants' choices: percussion instruments and everyday objects that produce sound (such as newspapers or mineral water bottles) are commonly chosen by clients, while string instruments are rarely selected by those without a musical background. This limits participants' access to interaction with string instruments. However, string instruments, being common melodic instruments, offer not only distinct interaction modes compared to percussion but also melodic musical expression. Digital instruments are often regarded as a means to lower the threshold for complex instruments, enabling participants with no musical background to explore various forms of musical interaction. Digitizing string instruments might therefore provide stress-reducing groups with additional therapeutic tools. Hence, we outline the following two specific targets. The first is to simplify the rules for playing string instruments: streamline the complexity of string instruments to make them user-friendly. The second is to maintain the affordance of a string: preserve the interaction form with strings, allowing individuals to naturally produce sound based on their prior knowledge.
O2: Supporting Multi-User Collaboration and Enhancing Social Interaction. Currently, the collaborative improvisation approach within groups involves each member independently using their instrument, creating cooperation at the music level. This turns instruments into tools for communication among members. However, we observed that when using instruments independently, individuals may become engrossed in playing their instruments, losing focus on others or even performing in isolation without coordination with others. Digital technology offers additional collaborative methods to facilitate communication among group members. In this study, we were inspired by design strategies summarized by Olsson et al. [47], focusing primarily on two strategies: 1) Disclosing information about others: enhancing the social presence of others, making them more easily or frequently noticed. 2) Introducing Constraints: employing mechanisms to foster cooperation, thus compelling interactions among individuals.
Incorporating these strategies into instrument design allows participants to become more aware of the presence of other group members while using their instruments. It also provides more potential for collaboration, may improve a sense of connection within the group, and consequently yields stress relief through a social approach.

4.2 Interaction between User and Instrument

To fulfill the first design objective, in designing user interactions with the string instrument, we initially simplified the number of strings to one. Next, we referred to Rossmy et al.’s work [56] summarizing interaction gestures for string instruments, selecting two simple and easily learnable gestures: "pull-off" and "sliding." These gestures are commonly used in playing widely popular string instruments like the guitar, making them closer to people’s prior knowledge of string instruments. Ultimately, we decided to map sound output based on the positions of these two gestures on the string:
Single tone mapping: The string is divided into eight segments, each corresponding to an octave interval. Pulling off at these eight segments triggers eight distinct sound outputs. These eight preset sounds can be freely combined, such as different pitches from the same instrument or different timbres from the same pitch.
Continuous mapping: Distance values along the string correspond to a continuous parameter of sound, such as volume or frequency. This allows for continuous control of sound variations through sliding fingers on the string, either controlling a waveform or an audio file.
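To make these two mappings concrete, the following minimal C++ sketch illustrates how a normalized finger position along the string could be translated into either a discrete segment index (single tone mapping) or a continuous volume value (continuous mapping). The eight-segment split follows the description above, but the specific numbers and console output are illustrative assumptions rather than ComString's exact parameters.

```cpp
// Minimal sketch of the two mapping modes (values are illustrative assumptions,
// not ComString's exact parameters). Finger position along the string is
// normalized to the range [0, 1].
#include <array>
#include <cstdio>

// Single tone mapping: the string is split into eight equal segments,
// and pulling off inside a segment triggers one of eight preset sounds.
int segmentForPosition(double pos) {
    int seg = static_cast<int>(pos * 8.0);
    if (seg < 0) seg = 0;
    if (seg > 7) seg = 7;   // clamp to the valid segment range 0..7
    return seg;
}

// Continuous mapping: position drives a continuous sound parameter,
// illustrated here as a volume fraction between 0.0 and 1.0.
double volumeForPosition(double pos) {
    if (pos < 0.0) return 0.0;
    if (pos > 1.0) return 1.0;
    return pos;
}

int main() {
    std::array<double, 3> touches = {0.05, 0.49, 0.93};   // sample finger positions
    for (double pos : touches) {
        std::printf("pos %.2f -> segment %d, volume %.2f\n",
                    pos, segmentForPosition(pos), volumeForPosition(pos));
    }
    return 0;
}
```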

4.3 Interaction Mediated by Instruments between Users

To achieve the second design objective, we have designed the instrument as a portable mobile device, allowing individuals to move freely and equipping it with the capability to provide information about others and support multi-user interactions. Each device can communicate with others wirelessly.

4.3.1 Disclosing Playing States about Others.

Lighting and vibrations are utilized to enhance social awareness, allowing each user to perceive the presence of others from their instruments, and amplifying their social presence. In addition to hearing others’ performances, the actions of others during their performance are also conveyed through dynamic lighting and string vibrations.
Through lighting, the devices display the dynamic actions of others’ performances and reinforce collaborative behavior:
The position of one’s hand on the strings of a device is synchronously reflected on the LEDs of other devices in real-time, displayed as bright white light. The light points representing one’s input are also white but dimmer compared to those of others.
When two group members have touches at the same location on their respective devices, the lighting transitions to a warm yellow hue. When three members simultaneously touch the same location, the lighting shifts to a warm orange hue. This serves to enhance collaborative behavior, signaling to users that someone is interacting with their device in a similar way.
Through vibration, the device synchronizes one user's pull-off action with the two other devices. If a device is not in use, its owner will see its string vibrating in response to another's performance; if they lightly touch the string, they can feel the sensation of the string being plucked by that user.
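As an illustration of the lighting rules above, the following minimal C++ sketch computes the color of each LED segment from the number of devices currently touching that segment and whether the local player is among them. The RGB values and the console demo are assumptions for illustration only; the actual prototype drives an LED strip from these states.

```cpp
// Minimal sketch of the lighting rules for one device's eight-LED strip.
// RGB values are illustrative assumptions; the rules follow the text above:
// own touch = dim white, another player's touch = bright white,
// two overlapping touches = warm yellow, three overlapping touches = warm orange.
#include <array>
#include <cstdio>

struct Color { int r, g, b; };

Color ledColor(int touchCount, bool selfTouching) {
    if (touchCount >= 3) return Color{255, 120, 0};   // warm orange
    if (touchCount == 2) return Color{255, 200, 0};   // warm yellow
    if (touchCount == 1) {
        return selfTouching ? Color{90, 90, 90}       // own input, dimmer white
                            : Color{255, 255, 255};   // someone else's input, bright white
    }
    return Color{0, 0, 0};                            // segment idle, LED off
}

int main() {
    // touchCounts[i]: how many of the three devices currently touch segment i.
    // selfTouch[i]:   whether the local player is one of them.
    std::array<int, 8> touchCounts = {0, 1, 0, 2, 0, 0, 3, 0};
    std::array<bool, 8> selfTouch  = {false, true, false, true, false, false, true, false};
    for (int i = 0; i < 8; ++i) {
        Color c = ledColor(touchCounts[i], selfTouch[i]);
        std::printf("LED %d -> (%d, %d, %d)\n", i, c.r, c.g, c.b);
    }
    return 0;
}
```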

4.3.2 Adding Collaborative Constraints.

We can use digital technology and enforced social constraints to make the instrument playable only when multiple people cooperate. We have two preset constraints:
(1)
Simultaneous string plucking for sound production. To produce sound, three individuals must simultaneously pluck the strings following a predefined chord sequence. The corresponding chord in the sequence is sounded based on the number of successful attempts.
(2)
Shared touch location. Sound is produced when two individuals touch the same position on the strings, resulting in a triad chord. When three individuals touch the same position, a seventh chord is played.
It’s important to note that these proposals are intended to convey the instrument’s collaborative capabilities to therapists. The specific rules for collaboration within the group therapy should be discussed and determined in consultation with the therapist.

4.4 Implementation

Based on the above design, we developed two initial prototypes with basic functionality and interactive examples for communication with therapists. In the co-design process, to align with the group sessions (Figure 5) designed by the therapists, the prototype underwent iterations. These included new interaction rules, optimization of the network structure, an increase in the number of devices to three, and a change in sound output to high-quality speakers connected to the computer (in the initial version, each device had its own low-quality Bluetooth speaker).
Since the detailed iterations of the technical approach are not the primary focus of this paper, we will directly describe the final technical implementation here.

4.4.1 A Single DMI.

Figure 3: Internal structure diagram of ComString
The structure of the prototype is depicted in Figure 3. It takes the form of a rectangular handheld device. The device frame is fabricated using 3D printing, and it incorporates a guitar string along its main body. At the bottom of the string, there is a parallel LED strip. The motor, sound system, and mainboard are situated within the right and bottom compartments. To facilitate grip, the underside of the frame is enclosed using a wooden board.
(1)
Input Detection. We employed a technology solution combining capacitive touch and an infrared distance sensor. A touch pin of the ESP32 is directly linked to the string to detect finger contact, and an infrared distance sensor is positioned above one end of the string to ascertain finger positioning.
This method has precision limitations, as the monitored distance values can be influenced by the force of finger pressure on the string or the orientation of the fingers. However, as a probe, it suffices for basic musical and social functionalities. To reduce the confusion caused by detection errors, we provide visual cues via lighting to help users understand how to produce the intended sounds with specific techniques on the string. A minimal sketch of this input path appears after this list.
(2)
Sound. We utilized Max/MSP integrated with Ableton Lite 11 for sound synthesis. Sound mapping was achieved through Max/MSP and could be connected to a Bluetooth audio module on the device, allowing the mapped sounds to be played through the device's speaker.
(3)
Light. We added an LED strip parallel to the string with eight LEDs representing octaves. Originally, it visualized inputs from other devices, but now it also shows users if their input is recognized.
(4)
Vibration. We integrated a motor into the probe to pluck the string, controlled by a plastic pick attached to the motor’s shaft. This design was inspired by the Awareness System, aiming to enhance social presence by physically generating dynamic interactions.
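The input path referred to in the Input Detection item above can be sketched as follows. This minimal Arduino-style C++ sketch for the ESP32 reads capacitive touch on the string through a touch pin and estimates finger position from the analog output of the infrared distance sensor. The pin assignments, threshold, and polling rate are illustrative assumptions rather than the values used in the actual prototype.

```cpp
// Minimal Arduino-style sketch of the ESP32 input path; the pin choices,
// threshold, and polling rate are illustrative assumptions, not the values
// used in the actual prototype.
#include <Arduino.h>

const int TOUCH_PIN = T0;           // touch-capable pin wired to the guitar string
const int IR_PIN = 34;              // analog pin reading the infrared distance sensor
const int TOUCH_THRESHOLD = 30;     // assumed: lower touchRead() values mean the string is touched

void setup() {
  Serial.begin(115200);
}

void loop() {
  bool touching = touchRead(TOUCH_PIN) < TOUCH_THRESHOLD;  // capacitive contact with the string
  int distanceRaw = analogRead(IR_PIN);                    // raw reading, related to finger position
  if (touching) {
    // In the real prototype these values are sent to the server board over UDP;
    // here they are simply printed for inspection.
    Serial.printf("touch detected, raw distance = %d\n", distanceRaw);
  }
  delay(20);  // poll at roughly 50 Hz
}
```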

4.4.2 CDMIs Network.

Figure 4: Network Structure of Three ComString Devices
The three instruments together form an ensemble system, depicted in Figure 4. Communication is facilitated using four ESP32 devices: one acts as a server and Wi-Fi Access Point (AP), while the other three serve as the main control boards of the devices. They communicate with the server via UDP, exchanging input data (string touch status and infrared distance values) and receiving output instructions (LED colors and positions, motor actions). The server also communicates with a PC through serial communication, forwarding input values from the devices to Max/MSP on the PC for sound mapping.
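The device-to-server exchange described above can be sketched as follows. This minimal Arduino-style C++ sketch shows one control board joining the server's access point and exchanging UDP messages: it sends its input state (a touch flag and the raw infrared distance) and receives output instructions (LED and motor commands). The SSID, port, and message format are illustrative assumptions; only the overall structure (one server acting as Wi-Fi AP, UDP exchange of inputs and outputs) follows the description above.

```cpp
// Minimal Arduino-style sketch of one control board's UDP exchange with the
// server ESP32. The SSID, password, port, and message format are illustrative
// assumptions.
#include <WiFi.h>
#include <WiFiUdp.h>

const char* AP_SSID = "comstring-ap";        // assumed name of the server's access point
const char* AP_PASS = "comstring";
const IPAddress SERVER_IP(192, 168, 4, 1);   // default softAP address of the server board
const uint16_t UDP_PORT = 8888;              // assumed port

WiFiUDP udp;

void setup() {
  WiFi.begin(AP_SSID, AP_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(100);
  udp.begin(UDP_PORT);
}

void loop() {
  // Send this device's input state: a touch flag and the raw IR distance value.
  char out[32];
  int len = snprintf(out, sizeof(out), "in,%d,%d", /*touch=*/1, /*distance=*/512);
  udp.beginPacket(SERVER_IP, UDP_PORT);
  udp.write(reinterpret_cast<const uint8_t*>(out), len);
  udp.endPacket();

  // Receive output instructions, e.g. "led,3,255,200,0" or "motor,1", if any.
  if (udp.parsePacket() > 0) {
    char in[64] = {0};
    udp.read(in, sizeof(in) - 1);
    // A real device would parse the message here and drive its LEDs or motor.
  }
  delay(20);
}
```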
To streamline development and achieve our design considerations effectively, we decided to centralize sound playback on the computer. This choice was made considering that Max/MSP on a single PC can only play sound through one Bluetooth device at a time and to address latency issues. This configuration provides a unified sound source for all devices, enhancing connectivity potential and allowing us to explore the impact on users when the sound deviates from the conventional instrument-based playing experience.

5 Stage 2: Co-design with Therapists

After completing the initial version of two ComString devices, we brought them to collaborate with music therapists, aiming to understand the potential of such CDMIs in stress-reducing groups from an expert’s perspective. We consulted with therapist K first, obtaining suggestions around ComString, and then engaged in in-depth co-design with therapist Z to shape the final session plan.

5.1 Use ComString for Therapeutic Goals

In actual music therapy practice, the value of an instrument is contingent upon its therapeutic goals. When we introduced ComString to the therapists, they were initially puzzled by the device's unclear purpose. From the designers' perspective, however, leaving the device's purpose open was intended as a starting point for therapists to provide input, so ample room for discussion was deliberately left. As the discussion progressed and they grasped the device's potential, they provided the following directions for further expansion.

5.1.1 Music Dialogue.

The first direction involves leveraging the advantages of digital instruments, with theoretical support, to enhance their musical capabilities, thereby helping visitors establish dialogues within the realm of music.
Specifically, therapists contend that they are unable to discern the benefits of substituting traditional instruments with a digital string instrument capable of producing only an octave. This is attributed to the limited range of an octave, and the inability of digital instruments to rival traditional ones in timbre, thereby struggling to generate overtones. They express a preference for instruments with superior sound quality, capable of producing rich overtones upon striking, as these overtones themselves induce feelings of comfort and relaxation, yielding the intended effects.
However, if this device were capable of producing chords straightforwardly, it could serve as the point of departure for surpassing traditional instruments. tK asserts that playing chords can significantly ease the establishment of musical dialogues: 1) chord progressions (successively played combinations of chords) can evoke impressions of a song. When one person plays a chord, others can more easily discern how to respond with a chord. 2) chords are more easily harmonized with songs. A common approach within a group is to allow visitors to spontaneously play instruments in harmony with a song and engage in collective singing. If ComString can provide several commonly used chords from popular songs, it can serve as an accompaniment for many songs, adapting to various group activities. 3) it is even possible to pre-set the chord progressions for specific songs, ensuring that visitors, regardless of their manner of playing, will produce the chords of that particular song. This enables a straightforward and seamless alignment with the chord progressions of the song, fulfilling individual expectations for music. As tK said: "The human body inherently holds expectations for music; changes in the intensity and pitch of the chords can instill musical anticipations within your body. When it meets those expectations, you experience a particularly positive sense of satisfaction and happiness. This is precisely the pinnacle experience that music bestows upon you."

5.1.2 Music Game.

The second direction involves the integration of CDMIs with group gaming activities, utilizing sound as a gaming element while attenuating its musical quality. In musical groups, group games are also employed to foster a social atmosphere. By combining the device’s musical co-performative capabilities with such activities, it may yield novel and entertaining social music games. For instance, based on ComString’s real-time capacity to visually depict finger positions through illumination, a therapist proposes a follow-and-mimic game: one person holds the device and glides along the strings, producing a sequence of tones. Others can discern their performance path through the illumination, enabling them to mimic the motions in an attempt to replicate the same sound.

5.1.3 Music Targeted Therapy.

The third direction relates to functional therapy scenarios. Therapists have also proposed using our instrument prototype as a targeted therapeutic device for specific groups. For example, it can be used for Parkinson's patients to achieve therapeutic effects by gradually refining finger-pressing movements on the strings, with sound feedback as an indicator of progress, thereby progressing from gross motor training to fine motor control. For patients with cognitive impairments such as Alzheimer's disease, the device can be set up to guide continuous finger movements with lights, facilitating coordination between hand and cognitive control. To achieve such effects, it is especially crucial to set appropriate difficulty levels. Starting with entry-level rules, the difficulty should be gradually increased to avoid boredom from long-term usage. The goal is to engage users to the point where they become "fans" of the therapy and willingly participate in further sessions.
Figure 5: The 5 sessions of music therapy, with ComString being used in different forms during Sessions 2 to 4. The right column describes interactive rules of ComString in Sessions 2 to 4.

5.2 Group Plan with ComString

As our target users are the general public in stress-reducing groups, we integrated tK’s suggestions in both ’music dialogue’ and ’music game.’ Building upon this, we created additional interaction proposals to enhance communication with tZ. Throughout the process, we highlighted the stress-reducing group scenario, guiding tZ to explore more universally applicable therapeutic techniques based on ComString. Ultimately, a comprehensive plan for group therapy sessions was finalized.

5.2.1 Final Plan of Therapy Sessions.

As shown in Figure 5, the plan consists of 5 sessions, with S2-S4 involving the ComString. Since only two to three people will use ComString at the same time, others engage in different ways in these sessions. The therapist will lead the sessions based on the plan, and S2-S4 could be executed multiple times to provide every participant with the opportunity to use ComString in each session. However, the final execution will still depend on the therapist's assessment of the on-site situation.
Figure 5 describes the involvement of the therapist, participants, and ComStrings in each session, outlining their roles and activities. S1 and S5 are traditional sessions in music therapy, while S2-S4 are designed around the ComString. S2: In groups of three, the three people holding the ComString follow the device's lighted guide to play the chord progression of the Canon, and the therapist then joins in with her own vocals; the other three without ComString join in as a chorus. S3: The groups of three are divided into a playing group and a mimicking group. The playing group improvises with their ComString, while the mimicking group imitates the sounds they hear with their movements. S4: Seven people sit in a circle, and two participants each control an ambient sound, collaborating to adjust the volume levels to create a scene. The therapist then chooses a piece of music to play according to the ambient sound; the participants controlling the ambient sounds continue to cooperate, while the others just need to listen quietly. At the end of the session, everyone shares their experience of the sound.

5.2.2 Interactive Rules of ComString.

Adapting to S2-S4, while maintaining the design of using LEDs and vibration to disclose information about others, we iterated the interaction rules of ComString as follows:
In S2, a guiding light and a mandatory coordination rule are added. Red lights guide the users of the three ComString devices to each play one note of a triad, and only when all three notes are touched correctly and simultaneously is the triad sounded (a minimal sketch of this gating rule appears after this list).
In S3, the ComString uses a single-tone mapping mode that corresponds to the octave pitch of instruments with different timbres, which are played independently by the users.
In S4, the ComString uses a continuous mapping mode that corresponds to the volume of ambient sounds. Users can slide their fingers on the strings to continuously control the volume of the selected sound, with the bottom position or released finger resulting in silence.
Additionally, the music used in S1 was created by tZ; in S5, 12 traditional Orff instruments are provided for the participants to select freely.
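The S2 gating rule can be sketched in minimal C++ as follows: each device's guided note and current touch are compared before the triad is sounded. The segment indices and the example triad are illustrative assumptions, not the exact values used in the session.

```cpp
// Minimal sketch of the S2 gating rule: each device highlights one target note
// of the triad in red, and the triad only sounds when all three guided notes
// are touched at the same time. Segment indices and the example triad are
// illustrative assumptions.
#include <array>
#include <cstdio>

struct DeviceState {
    int guidedSegment;    // segment highlighted in red on this device
    int touchedSegment;   // segment currently touched, or -1 if none
};

bool triadShouldSound(const std::array<DeviceState, 3>& devices) {
    for (const DeviceState& d : devices) {
        if (d.touchedSegment != d.guidedSegment) return false;
    }
    return true;   // all three guided notes are held simultaneously
}

int main() {
    // Example: a triad spread across the three devices (segments 0, 2, and 4).
    std::array<DeviceState, 3> devices = {{{0, 0}, {2, 2}, {4, 4}}};
    std::printf("triad sounds: %s\n", triadShouldSound(devices) ? "yes" : "no");
    return 0;
}
```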

6 Stage 3: Therapy Group Practice

In stage 3, we held a therapy practice based on the plan shown in Figure 5. This section documents the results, including the process of the practice, the therapist's behavior, and the participants' experiences.
Figure 6: Timeline of the practice of therapy sessions. The pie charts show the therapist’s behavior in each session.

6.1 Process of the Practice

The actual process is shown in Figure 6, in which every session was executed smoothly as scheduled, except for S2. The unexpected occurrence in S2 can be attributed to the limited usability of the ComString, which made continuously triggering the chords according to the rule a great challenge for the three participants encountering the probe for the first time, even with explanations and advice from the researchers. Faced with this difficulty in S2, the first three participants discussed a way to break through. Because it was difficult for all three of them to press the correct chord positions simultaneously, they decided that one person would play the chords as previously planned, while the other two continuously slid their fingers on the strings to increase the chances of triggering the correct positions. However, this strategy created new barriers to playing the chords continuously and rhythmically. Therefore, when the ComString was handed over to the next three participants, the therapist temporarily changed the rules of the activity. Under the new rule, the therapist played chords on a MIDI keyboard and sang, leading everyone to join in directly with their voices, while the last three participants could continue to try to trigger chords together or freely explore the instruments. In fact, after witnessing the difficulties of the previous participants, they made only superficial attempts at collaboration; instead, they mostly treated the device as a novelty and explored it freely on their own.
S4 was the most appreciated session, in which two participants cooperated to control an ambient sound, combined with the background music selected by the therapist, creating a unique sense of atmosphere. Each person had distinctive visual imagery and atmospheric experiences according to their own background and characteristics, but all shared a common feeling of being deeply healed. In this relaxing atmosphere, reminiscent of sitting around a campfire and having a heart-to-heart conversation, all six participants voluntarily shared the images that emerged in their minds, sparking discussions on why this session felt so healing.

6.2 Therapist Behavior

Excluding behaviors that simply move the process along or remain unrelated to the participants, we categorized four distinct therapist behaviors based on their purposes and impacts on the group:
Instructing: To inform participants of what they need to do in the upcoming session, therapists give clear commands or instructions. This influences the roles and task assignments of the participants, specifies constraints and rules, and facilitates the development of the group within requirements.
Supporting: Participants come up with a goal, and the therapist provides support to help them achieve it.
Guiding: Therapists have a desired goal in mind and guide participants to find suitable ways toward it. This behavior emphasizes cooperation and collaboration with the participants.
Intervening: Therapists initiate immediate actions that cause changes within the group, exerting a stronger influence over the participants directly.
The instructing behavior usually occurs at the beginning of a session, including activities such as grouping, allocating musical instruments, and explicitly instructing participants to follow certain rules during the session. Guiding and supporting constitute the majority of therapist behaviors across the group as a whole, and within individual sessions there is a clear tendency toward either guiding or supporting as the primary focus. Intervening, on the other hand, is a more assertive behavior in which therapists directly influence the participants based on their judgments. This was mainly observed in S4, where the therapist intervened by playing music and altering the atmosphere created by participants, thereby changing their emotional experiences. The targets of guiding and supporting are more diverse, so the following subsections describe these two main themes based on the objects of the behaviors.

6.2.1 Supporting.

Supporting can be provided through coordination, clarification, balance, and encouragement. Such behaviors do not influence the direction of participants' actions but rather provide support in alignment with their expectations. Coordination means adapting to participants: for example, in S2, the therapist adapted her singing to match the participants' rhythm instead of asking the participants to adjust theirs. Clarification involves answering questions related to rules or music theory, helping participants achieve their goals smoothly; for example, in S3, when a performer could not distinguish pitch during the interaction, the therapist provided answers. Balance means striving to make everyone feel satisfied; for example, in S3, the therapist balanced the performances between groups and controlled the number of interactions to ultimately achieve a draw. Encouragement entails affirming individual performances and giving praise during the interaction; for example, the therapist encouraged those who were hesitant to sing by saying "you sang already very well" in S2, and praised participants for their "excellent imitation of actions" in S3.

6.2.2 Guiding.

Guiding aims to lead participants' actions in the direction expected by the therapist, prompting everyone to experience, express, reflect, or interact. Guiding participants to experience: this is most prominent in S1, where the therapist continuously inserted guiding phrases to help everyone become aware of their current bodily state and relax, including "the contact between your body and the ground", "take a deep breath", and "notice your fingertips". Guiding expression: the therapist used music, gestures, language, and so on to guide participants to freely express themselves through music or their bodies. For example, in S2 everyone was guided to sing, a means of expressing emotions through both the body and music, and similarly, in S5 embodied instruments were used for expression. Guiding reflection: towards the end of each session, the therapist usually asked about everyone's feelings or the logic behind their behaviors. For example, in S3, the therapist asked the imitation group to explain the design of their movements, and in S4, everyone reflected on their feelings and whether they felt comfortable during the preceding sound. When there was a rich sense of feeling among participants, establishing a suitable atmosphere for communication, the therapist might further guide deeper emotional reflection. Guiding interaction: the therapist used language to promote communication between participants. For example, in S3, she asked the expressive group whether they could understand the meaning behind the movements of the imitation group.
Consequently, from the proportions of therapist behaviors in different sessions shown in Figure 6, it is evident that therapy behaviors exhibit distinct tendencies based on session designs. In sessions with clear tasks and more social actions (S2&S3), therapists are more inclined to offer support and encourage participants to take autonomous actions. Conversely, in sessions with open-ended tasks (S1&S4&S5), therapists take a more proactive role in guiding or intervening, exerting a stronger influence on participants’ experiences.

6.3 Participants' Experience

We recruited six designers or researchers who had recently experienced stress and expressed a desire to relax to participate in the therapy group (see Table 1). These individuals are labeled A1-A4, B, and C based on their interpersonal relationships: A1, A2, A3, and A4 knew each other beforehand, while B and C were strangers to the other five. A1, A2, and A3 formed Group A1, while B, C, and A4 formed Group B. Before the therapy, we used the HADS scale to assess their anxiety and depression levels over the past week. The results showed that A1, B, and C had recently experienced high levels of stress, while A2, A3, and A4 were not affected at a notable level.
Table 1: Results of HADS Scale
             A1   A2   A3   A4   B    C
Anxiety      11    6    8    2   15    7
Depression    8    3    3    3    9   11
The therapeutic approach employed in this group involves allowing participants to express, feel, and understand their emotions during music activities. According to tZ, "A therapist anticipates a positive outcome where clients can explore new spiritual experiences in the group, promote self-growth, and find support in facing everyday events with greater strength." Emotional experiences are personalized, and the commonality in group therapy lies in each person's ability to express and feel their unique emotions in their own way. Through participants' emotional experiences, we can observe their main gains and the sources of therapeutic benefit in the group. We compiled these emotional experiences and summarized them into two basic emotions (positive and negative) and two more complex ones, identifying four themes related to emotional experiences in this study. Although the richness and depth of these feelings may go beyond the scope of words, this paper tries to present the rich experiences behind them through detailed descriptions as much as possible.

6.3.1 Extraordinary Experience.

C emphasized a unique "positive sense of uncertainty". Uncertainty usually brings her relatively negative emotions such as anxiety or depression in daily life, but unexpectedly it brought about positive feelings in this activity. The therapist evaluated such an experience as an ideal type of effect that music therapy aims to achieve: a new experience that breaks away from habitual patterns, gradually transforming situations that were once feared or worrisome into harmless and positive ones. In our case, when initially exploring the digital instruments, the unfamiliarity of these devices brought C a sense of uncertainty. However, instead of causing hesitation, it energized and motivated her to try to master them. Similarly, during S4, when collaborating with B to control the ambient music, the sense of uncertainty increased and triggered concerns due to the openness of the task. However, she discovered that B's rain sounds changed in sync with her campfire sound, experiencing B's cooperation in creating a narrative, which allowed her to regain a sense of positive motivation.

6.3.2 Connectedness.

We categorize feelings of collective consciousness, unity, belonging, and the like under the theme of "connectedness", which primarily arose from the following three types of situations. Firstly, the design of the device's lighting: C and B were the only ones who felt a sense of connection through the warm light changes triggered by overlapping finger positions, which also left a deep impression on them. C specifically marked the moment when the lights turned yellow in the trajectory as "a sudden feeling of getting connected". A1 did notice the relationship between color changes and cooperation but did not have an obvious emotional response to it. B proposed an explanation, suggesting that the lighting facilitated a sense of connection particularly for newcomers like C, in contrast to Group A members who were already well-acquainted. Moreover, while collaborating with C in S4, B interpreted C's actions based on the movement of the white LEDs, which represented C's finger movements; this enhanced her enjoyment of creating the scene "together" with C. Secondly, completing tasks with a common goal: both A4 and B mentioned feeling a strong sense of connectedness during S3. A4 explained, "We all unite to accomplish a task," while B stated, "We become particularly united when there's a goal to overcome." Thirdly, feelings and associations triggered by music: both A1 and C mentioned a sense of connectedness during S4. A1, while listening to the ambient music for the first time, imagined a scene of inviting friends to gather at her home and noted the variations in rain and campfire sounds, saying, "We were originally having a campfire outdoors, but then it started raining, so we moved inside by the fireplace to listen to the rain and chat." In contrast, during the second round of ambient music, C described a scene of herself drying clothes in the forest; with the accompanying music played by the therapist, the scene transformed into one where a magical friend appeared in the forest, bringing along a story of new friends, and she felt "a shift from loneliness to warmth".
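To illustrate the awareness cue described above, the following minimal sketch computes one lighting frame per update: white LEDs mirror each player's finger position along the string, and the strip blends to a warm yellow when two positions overlap. The 0-1 position encoding, the overlap threshold, and the frame representation are illustrative assumptions, not ComString's actual firmware.

WARM_YELLOW = (255, 180, 60)
WHITE = (255, 255, 255)
OVERLAP_THRESHOLD = 0.05  # fraction of string length; assumed value

def render_awareness(own_pos: float, other_pos: float, num_leds: int):
    """Return one lighting frame as a list of (r, g, b) tuples, one per LED."""
    frame = [(0, 0, 0)] * num_leds
    own_idx = round(own_pos * (num_leds - 1))
    other_idx = round(other_pos * (num_leds - 1))
    frame[own_idx] = WHITE      # your own finger position
    frame[other_idx] = WHITE    # the partner's position, made visible to you
    if abs(own_pos - other_pos) < OVERLAP_THRESHOLD:
        # Overlapping positions: switch the whole strip to a warm cue,
        # the moment C described as "a sudden feeling of getting connected".
        frame = [WARM_YELLOW] * num_leds
    return frame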

6.3.3 Positive Emotions.

"Relaxation, happiness, positiveness, and vitality" are frequently mentioned words by the participants to describe positive experiences during this therapy group. In addition to the positive emotions elicited by the successful achievement of the task’s original design goals (e.g. S1 inducing relaxation), the performance details of the probes and the interactions with others can also bring additional beneficial feelings. During the S4, the probe provided a subjective tangible interaction for controlling the ambient sounds. This unique control set it apart from simply playing a recorded sound sample and was considered as the key reason for "a more immersive experience". A1 believed that this control made the ambient sound fluctuate in volume a bit unpredictably, showing a stronger sense of realism. On the other hand, B felt that it was because they knew "someone is in control", creating a sense of "being led and enhancing immersion". The encouragement, understanding, and active participation of others are also important factors that contribute to positive emotions. For example, A1 felt motivated and inspired by the therapist’s encouragement, while B expressed great joy when the therapist understood the meaning behind their imitated actions. Additionally, the active attitude of Group B during the S3 had a contagious effect on Group A1, making all three of them feel that "others were engaged, which motivated and brought me joy (A2)".

6.3.4 Negative Emotions.

"Tension, embarrassment, and anxiety" are the most common negative emotions experienced by the participants, primarily stemming from the usability of the devices and concerns about their performance. As a typical example, the inability to control the devices caused great frustration and anxiety for Group A in the S2. Additionally, explicit tasks can also enhance worries, with participants fearing embarrassment if unable to complete them, resulting in feelings of awkwardness. For instance, A3 mentioned, "It feels like I don’t know how to use this thing, naturally making me worry that others might think I’m dumb for not even knowing how to play with it. " During the S4 when A2 controlled the intensity of the ambient music, they expressed, "Because I bear the responsibility of creating an environment for others, it feels burdensome, afraid of providing a negative experience for everyone." In S5, when everyone’s rhythm stabilized, they might still hesitate, saying, "I want to change the rhythm, but I don’t know what to do. I’m afraid of disrupting the order, which is a bit awkward."

6.4 The Retrospective Exploration with the Therapist

Following the conclusion of the group practice, we conducted an in-depth retrospective study and interview session in collaboration with tZ. Employing a retrospective approach, we retraced our steps from the recently concluded practice back to the design of ComString, aiming to unearth unconventional and unrecorded yet intriguing findings. This encompassed aspects such as the forms of the applied therapy techniques, notable participant behaviors, the therapist's emotional fluctuations, and novel conceptualizations of musical instruments.
Regarding participant behaviors, we cataloged both positive and negative manifestations across the various therapy sessions, with the therapist providing profound insights. The therapist encouraged us to contemplate the dual nature of participant behaviors, mentioning, "During the initial ice-breaking phase, some clients exhibited verbosity, which may not necessarily signify heightened engagement but rather serve as an expression of their inner defensiveness and discomfort through verbal communication." Additionally, the therapist noted that some clients possessed extroverted personalities while others leaned towards introversion; when extroverted clients displayed conspicuous signs of frustration, it could prompt introverted clients to find a degree of psychological equilibrium. Numerous other participant behaviors were observed, but such discoveries are challenging for computers to capture and document comprehensively, relying instead on the therapist's expertise for summarization and transmission.
During the retrospective exploration of the instrument-learning phase of the music therapy, we conveyed certain clients' inner psychological experiences to the therapist. Client A mentioned, "I felt anxious at the beginning of learning the instrument because I didn't understand music theory, but the therapist patiently explained things, which made me develop a sense of reliance and respect for the therapist." The therapist expressed gratification and stated, "In music therapy, I indeed aspire to receive positive encouragement from clients, aiming to establish a mutually beneficial emotional feedback loop between myself and the clients. In this ongoing process, I heal them, and they, in turn, uplift me!"
The therapist expressed approval of CDMIs. She had limited exposure to DMIs in previous therapies but found the ComString system highly inspiring. She initially stated, "The interactivity offered by ComString is a significant source of inspiration for me, and it has proven to be effective in fostering emotional connections within social groups." Additionally, the therapist mentioned, "DMIs can provide a broader range of musical genres, such as natural sounds, environmental sounds, and even synthetic music, which can enrich the therapist’s toolkit." Furthermore, we discussed potential future directions in the design of CDMIs. We inquired whether the therapist had frequently needed to exercise control over individual or group music performances in previous therapies. The therapist affirmed this and pointed out, "On one hand, I aim to enhance the musical experience through my control, ensuring client satisfaction; on the other hand, I aspire to offer a more diverse array of musical materials."
In the realm of music therapy interventions, the therapist highlighted the intricacies of dealing with diverse client emotions and underscored the gravity of the field of music therapy. Different client groups necessitate distinct therapeutic strategies. The therapist mentioned, "For clients who are emotionally sensitive, we must be vigilant about their emotional and behavioral fluctuations and exercise caution when introducing new materials and therapeutic elements to prevent extreme emotional responses." However, the therapist also noted that our work could be applied to groups such as school-age children or their families, stating, "This demographic may not necessarily seek decompression, but they can gain knowledge through music experiential learning, fostering emotional connections among themselves."
During this retrospective analysis, as HCI researchers, we did not come away with a sense of mastery from having completed this study. Instead, we discovered that the field of music therapy offers a broader research landscape, igniting a newfound enthusiasm and drive to delve deeper into it.

7 Discussion

While the previous section focused on ComString, the following discussion adopts a broader perspective, delving into the effectiveness, opportunities, and challenges of CDMIs in the context of music therapy. This encompasses the design, the people, and the technology surrounding CDMIs.

7.1 The Design of CDMIs

7.1.1 Enabling users to derive therapeutic benefits from collaboration.

The collaborative strategies incorporated into ComString and the group plan developed around it did indeed result in an effective music therapy session. However, the relationship between the collaborative features and rules and the ultimate therapeutic outcome is complex. When the device is placed in a real social setting, it is just one of the factors that influence group dynamics. We identify three factors that shape group dynamics and sustain the continuous flow of emotional experiences:
Group Factor: Encompasses the behaviors and emotional expressions of other participants and therapists.
Inner Factor: Refers to one’s current emotional experiences, behaviors, personality, attitude, etc.
Device Factor: Involves the feedback from the device to the user or others.
The abundance of Group and Inner factors can lead to diverse group dynamics for the same Device factor. For instance, when the frustrating learning process of ComString exacerbated users' anxiety, users with extroverted and dominant personalities (Inner Factor) were motivated to seek assistance from others (Group Factor), thereby building rapport more quickly and advancing the activity. At the same time, the anxiety displayed by extroverted and dominant individuals (Group Factor) could create emotional balance for introverted and less assertive individuals, helping them relax when facing the same frustrating learning process (Device Factor), thus promoting collaborative efforts during the activity.
From the results, it is evident that different individuals, when facing the same phenomenon within the group, navigate towards a comfortable emotional experience through various pathways. This emphasizes the complex relationship between the collaborative capabilities of CDMIs and the ultimate experience, urging us to approach it dialectically through more practice and to avoid evaluating CDMIs outside of real therapeutic situations.
Please note that this example does not argue for frustrating design; rather, it emphasizes the importance of observing how people respond in various ways to a situation induced by CDMIs. After all, in most cases, impressive therapeutic feelings generally stem from positive and enriching experiences. For instance, participant C mentioned the positive uncertainty created by teammates, and in S4 the "active white noise" led to a more immersive atmosphere through attentive listening.

7.1.2 Encouraging therapists and designers to establish design consensus through collaboration.

The successful therapeutic outcomes in this study rely on effective collaboration between therapists and designers. Before the formal design, it is essential to establish shared design goals, primarily providing effective music therapy for users, in order to avoid divergent priorities arising from differences in professional knowledge backgrounds. For example, therapists may prioritize the practicality of CDMIs, which can be understood as cost-effectiveness; costs encompass the effort involved in developing new group plans, the learning costs for therapists and clients using the device, and pricing. This may lead therapists to overlook the benefits of exploratory design. For instance, tK initially considered the design of using lights in ComString to display others' finger positions impractical and suggested adding other functional features to enhance its effectiveness. However, upon understanding the purpose of increasing social presence, he accepted it, and it ultimately proved to have a positive impact. Therefore, finding a balance between the expectations of both parties is crucial in the co-design process.
Secondly, developing a prior understanding of each other's knowledge is a necessary process. Designers' pre-understanding helps them extract core, valuable insights from the complex knowledge system of music therapy. It is also important to understand the behavioral characteristics of the therapists we collaborate with. In a deep co-creation process, we can not only see how therapists design group plans but also observe how they guide the group process. The therapists' behavioral characteristics summarized in Section 6.2 can provide valuable references for future design research.

7.2 The People of CDMIs

7.2.1 Pay attention to how therapists feel.

In this co-design, we also observed that therapists’ needs beyond achieving therapeutic goals in group settings are seldom recognized. As pivotal figures in music groups, therapists are constantly monitoring the progress of sessions and intervening as needed to facilitate emotional well-being among clients. Throughout each phase of the process, therapists also anticipate receiving feedback, both positive and constructive, from clients, fostering a positive emotional cycle and motivating therapists. In this particular task, the CDMIs serve as a collaborative tool, involving therapists in the teaching of musical instruments. This fosters emotional dependence and trust between clients and therapists. Additionally, therapists express their desire for access to new knowledge and resources, including interactive and diverse music, as well as tools and activities that empower them to enhance the therapy practice.

7.2.2 Diversity of therapeutic groups.

We classified groups based on the mental health level of clients in Section 2.1. Similarly, therapists differentiate stress-reducing groups from serious music therapy, defining the former as non-serious, experiential activities. They emphasized the diversity and sensitivity of clientele, calling for caution in dealing with them. In HCI work related to music therapy, there is no shortage of research focused on specific users, as we have noted in the related work. The approach of such research, addressing an explicit illness, differs from the one presented in this paper, and it would be inappropriate to apply the design knowledge from this paper directly to non-universal groups.
People in universal groups are also important clients for music therapy, deserving the attention of HCI researchers. The study participants in this work represent a specific demographic of working professionals exposed to high-stress work or academic environments. They seek therapy with low economic and learning barriers, emphasizing emotional well-being over strong therapeutic goals. Open to diverse therapeutic conditions, they anticipate engaging in novel collaborative tasks outside their regular settings to gain new knowledge and experience mental relaxation while fostering friendships. These behavioral and psychological characteristics can be extrapolated to similar groups. Faced with such people, who have no clear symptoms but are very likely to be in a sub-healthy state, our approach to exploring the therapeutic uses of CDMIs can serve as a reference for enhancing universal group therapy. This approach does not aim to improve the therapeutic effects on specific illnesses, but to seek creative ways to bring novel experiences with therapeutic effects.

7.3 The collaborative technologies of CDMIs

This paper demonstrates the effectiveness of employing design strategies that enhance collocated social interaction to design CDMIs. Other design strategies[47, 62] not utilized in this study hold the potential for benefiting group music therapy. Building upon this foundation and drawing from experiences in the co-design process, we propose the following design strategies tailored for universal group music therapy.

7.3.1 Towards Easy Ensemble.

One strategy is to focus on ensemble playing in group music therapy, using CDMIs to lower the barrier to playing (S2) and to enrich interactive modes of collaboration (S4), making it easier for clients to collaborate with others and to create meaningful musical experiences that contribute to therapeutic effects. Music therapy emphasizes self-growth rather than producing masterpieces[60]. Music therefore serves as a communication medium beyond language, enabling dialogue among group members.
Musical dialogue, on the one hand, emerges from the instinctive expression of emotions. Clients can strike, pluck, or shake instruments to produce sounds as a form of self-expression, responding to the sounds of others. In this case, CDMIs can be used to enhance self-expression and balance group expression. By integrating technologies such as public displays, visual elements in the space or on the surface of instruments can be presented, creating an atmosphere that facilitates easier and more abundant self-expression and enhances social presence for others[23, 35, 55]. Serving as a form of non-human facilitator, this assists therapists in observing the group, balancing expression weights, and guiding individuals to express themselves boldly through music[12, 32].
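One way to operationalize "balancing expression weights" is sketched below: each player's recent note events are counted over a sliding window, and players falling below a share threshold can be surfaced to a display or to the therapist. The window length, the threshold, and the player identifiers are assumed values for illustration, not measurements from the study.

from collections import deque
import time

WINDOW_SECONDS = 30.0
IMBALANCE_RATIO = 0.10   # below 10% of group activity counts as under-expressing (assumed)

events = {pid: deque() for pid in ["A1", "A2", "A3"]}

def record_note(pid: str, now: float = None):
    # Log one note event for a player; called from the instrument's input handler.
    events[pid].append(now if now is not None else time.time())

def expression_weights(now: float = None):
    # Return each player's share of note events within the sliding window.
    now = now if now is not None else time.time()
    counts = {}
    for pid, q in events.items():
        while q and now - q[0] > WINDOW_SECONDS:   # drop stale events
            q.popleft()
        counts[pid] = len(q)
    total = sum(counts.values()) or 1
    return {pid: c / total for pid, c in counts.items()}

def under_expressing(now: float = None):
    # Players whose share falls below the threshold; candidates for gentle prompting.
    return [pid for pid, w in expression_weights(now).items() if w < IMBALANCE_RATIO]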
On the other hand, musical dialogue can be supported by music theory. The distinction between a single note and a chord mentioned by tK helps illustrate this point. Just as grammar supports organized sentences, chords are like words in a musical sentence, while single notes are like individual letters. Therefore, instruments that can directly play chords make it easier to establish a musical dialogue and facilitate therapists' understanding of and empathy with clients. In such cases, the interaction modes of CDMIs can be designed based on music theory, as in S2, allowing clients to play harmonious ensembles without musical literacy.
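A minimal sketch of such a theory-based mapping follows: one normalized gesture position selects a scale degree, and the instrument sounds the full diatonic triad, so untrained players remain harmonically compatible with one another. The choice of C major and the MIDI note numbers are illustrative assumptions rather than the mapping actually used in S2.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71]  # MIDI: C4 D4 E4 F4 G4 A4 B4

def diatonic_triad(degree: int, scale=C_MAJOR):
    """Return the MIDI notes of the triad built on a 0-based scale degree."""
    return [scale[(degree + step) % len(scale)] + 12 * ((degree + step) // len(scale))
            for step in (0, 2, 4)]

def gesture_to_chord(finger_pos: float, num_degrees: int = 7):
    """Map a normalized finger position (0.0-1.0) to a diatonic triad."""
    degree = min(int(finger_pos * num_degrees), num_degrees - 1)
    return diatonic_triad(degree)

# e.g. a finger at 0.0 sounds C-E-G, while a finger near 0.6 sounds G-B-D.
print(gesture_to_chord(0.0), gesture_to_chord(0.6))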
In addition, sharing one’s experiences in ensemble playing is another form of dialogue. As in S4, creating a musical atmosphere that sparks imagination is beneficial for clients’ associations and expressions, allowing for even deeper sharing of individual memories and expectations.

7.3.2 Towards Open-ended Play.

The second strategy focuses on playfulness in social interaction. We can design CDMIs with interaction rules for open-ended play in which social interaction is an important aspect. Open-ended play is play with no pre-determined limitations and no "right" or "wrong" outcomes, just like the design of S3. This topic has been well developed in the field of interaction design, with a wealth of experience to draw upon, including interaction models[67], design frameworks[59], and effective playful elements (e.g., proxemics[28], body movements[59], and playgrounds[42]). For our design objectives, key aspects include leveraging sound as a crucial game element, integrating it with CDMIs, and, with the assistance of therapists, incorporating it into the therapeutic context.
Using the enrichment of S3 as an example, we can draw on experience from open-ended play. For instance, we can integrate the activity space with ComString's sound output, allowing performers to play and store sounds in different locations. Imitators can then move around the room to mimic the sounds stored by the performers. Additionally, clients can introduce new rules, such as exchanging the performer and imitator roles when certain conditions are met.
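This rule enrichment can be captured in a small sketch: performers store sounds in room zones, imitators attempt to reproduce what is stored in the zone they move to, and roles swap after a number of successful imitations. The zone representation, the similarity test, and the swap condition are all assumptions introduced for illustration.

stored_sounds = {}           # zone -> sound clip stored by a performer
roles = {"A1": "performer", "B": "imitator"}
successful_imitations = 0
SWAP_AFTER = 3               # assumed rule: swap roles after three successful matches

def store_sound(player: str, zone: str, clip):
    # Only performers may leave sounds in a zone of the room.
    if roles[player] == "performer":
        stored_sounds[zone] = clip

def attempt_imitation(player: str, zone: str, clip, similar) -> bool:
    """similar(a, b) is whatever matching criterion the group agrees on."""
    global successful_imitations
    if roles[player] != "imitator" or zone not in stored_sounds:
        return False
    if similar(clip, stored_sounds[zone]):
        successful_imitations += 1
        if successful_imitations >= SWAP_AFTER:
            # Clients' own rule: exchange performer and imitator roles.
            for p, r in roles.items():
                roles[p] = "imitator" if r == "performer" else "performer"
            successful_imitations = 0
        return True
    return False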

7.3.3 Towards Democratization.

The third aspect involves empowering therapists and clients with the ability to use and modify digital therapeutic tools, aiming to design more democratized CDMIs. Democratization of technology comes with the advantages of low learning costs and rapid development, enabling a broader implementation of practices based on a unified toolkit.
On one hand, we can develop modular toolkits for therapists, enabling them to use materials from the toolkit to create different CDMI functionalities or to modify traditional instruments. The toolkit can include foundational digital components such as basic sensors and actuators, new materials in HCI[7, 58], and customized instruments[39, 48].
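A minimal sketch of what a therapist-facing mapping layer in such a toolkit might look like is given below: each module pairs a sensor reading with an actuator parameter through an editable transfer function, so a traditional instrument can be augmented without programming from scratch. All class, parameter, and function names here are hypothetical.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Mapping:
    read_sensor: Callable[[], float]          # e.g. a distance or pressure sensor
    write_actuator: Callable[[float], None]   # e.g. a volume, pitch, or LED channel
    transform: Callable[[float], float] = lambda x: x

    def step(self):
        # One update cycle: read, transform, write.
        self.write_actuator(self.transform(self.read_sensor()))

# Example: route a normalized pressure reading to a loudness parameter,
# compressing the range so soft touches still produce audible sound.
def gentle_curve(x: float) -> float:
    return 0.3 + 0.7 * x

mapping = Mapping(read_sensor=lambda: 0.4,
                  write_actuator=lambda v: print(f"set volume to {v:.2f}"),
                  transform=gentle_curve)
mapping.step()   # -> set volume to 0.58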
On the other hand, democratizing and packaging cutting-edge technologies with the potential to assist music therapy can increase accessibility to these emerging tools. For example, encapsulating advanced technologies like AI in a therapist-friendly form allows them to use AI tools according to their needs[25, 36, 74].

7.4 Group Art Therapy and HCI

In social groups such as music therapy groups, or therapy groups using other art forms, the closed and secure social atmosphere often facilitates genuine emotional experiences. In the process of designing interactive technologies for music therapy groups, we observed the uniqueness of this setting in eliciting emotional changes in clients and therapists. Most of the time, their emotional expressions are tacit and silent. Clients' emotional expressions appear purer, as they break free from the constraints of everyday life, social norms, rules, and identities. The art therapy group, as a research area or scenario, is well suited for HCI researchers exploring how technology and socio-emotional experiences interact.
Incorporating insights from group art therapy into HCI can contribute to more emotionally engaging technologies, by understanding the nature of communication, collaboration, and emotional expression inherent in it.

8 Limitation

The main limitation of this case study is the usability of the technology probe. Although the infrared distance-measurement solution reduces cost, it also significantly increases the instability of the equipment, making it less controllable in the therapy group than in the original laboratory setting. This lack of usability disrupted the group process, preventing an understanding of participants' experiences under normal conditions. On the other hand, this unexpected outcome allowed us to observe how therapists and participants face unexpected events and cope with high-difficulty tasks. Additionally, the lighting design used as the awareness system had shortcomings: it caused confusion and interference when combined with other guiding functions, even with different light colors. Only when the function was simple enough and the lights served no other purpose did others' light cues appear effective.
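One possible mitigation for this instability, had we iterated further, would be to filter the raw infrared readings before mapping them to sound. The sketch below combines a short median window (to reject spikes) with exponential smoothing (to calm jitter); the window size, smoothing factor, and example readings are assumed values rather than the probe's actual firmware.

from statistics import median

class SmoothedDistance:
    def __init__(self, window: int = 5, alpha: float = 0.3):
        self.window, self.alpha = window, alpha
        self.buffer, self.value = [], None

    def update(self, raw_cm: float) -> float:
        # Keep the most recent readings and take their median to reject spikes.
        self.buffer.append(raw_cm)
        if len(self.buffer) > self.window:
            self.buffer.pop(0)
        filtered = median(self.buffer)
        # Exponential smoothing to reduce residual jitter.
        if self.value is None:
            self.value = filtered
        else:
            self.value = self.alpha * filtered + (1 - self.alpha) * self.value
        return self.value

sensor = SmoothedDistance()
for raw in [30.1, 30.4, 120.0, 30.2, 29.8]:   # 120.0 stands in for a typical IR spike
    print(round(sensor.update(raw), 1))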
As an attempt to apply CDMIs in a therapy group setting, this study involved only two therapists and implemented one set of group activities, resulting in a limited range of observed phenomena; more observations and records are needed to fully support our findings. Furthermore, the focus was solely on experiential music groups, and whether the specific experiences of such groups can be generalized to other types remains to be investigated. Nevertheless, as a complete exploration, we believe this collaborative process can provide valuable references for designers interested in this topic.

9 Conclusion

This study explores the integration of Collaborative Digital Musical Instruments (CDMIs) into group music therapy. Through collaboration with therapists, we explored ComString as a design probe to highlight strategies for enhancing social interaction and reducing pressure in basic string-based CDMIs.
Observations revealed four therapist behaviors - instructing, supporting, guiding, and intervening - linked to session design. Participants had personalized experiences during the sessions but unanimously appreciated the 'active' ambient sound of the CDMIs, which promoted comfort and healing.
We addressed research questions concerning the design of CDMIs in the group music therapy setting, including 1) how to design CDMIs, 2) how therapists and clients use CDMIs, and 3) what stakeholders gain from applying CDMIs. In response to these questions, our answers are:
(1)
Designing CDMIs for a music therapy group requires designers to co-create closely with therapists. The design case of ComString is one example among various potential approaches. Based on the distinction between serious music therapy and stress-relaxing groups, our approach is less constrained and open to playfulness and creativity rather than addressing specific problems arising from a particular illness. Several design strategies are proposed to support further design research on CDMIs in music group contexts, including designing towards easy ensemble, open-ended play, and democratization.
(2)
Therapists utilize CDMIs as tools for applying their therapy techniques, expecting practicality and ease of adaptation to their therapy sessions. Clients have personalized experiences with CDMIs that are inseparable from the session design, and they clearly benefit from the social aspects of CDMIs.
(3)
Applying CDMIs in group music therapy sessions benefits stakeholders in different ways. For therapists, CDMIs provide alternative tools; for clients, they bring novel experiences; for designers, CDMIs open a pathway to aid music therapy and promote human well-being. They also serve as a field for researching shared interaction and emotional communication.
This paper raises the open question of the efficacy of CDMIs in therapeutic groups, and we look forward to further design practice and discussion to promote human well-being.


References

[1]
Sonja Aalbers, Marinus Spreen, Kim Pattiselanno, Peter Verboon, Annemieke Vink, and Susan van Hooren. 2020. Efficacy of emotion-regulating improvisational music therapy to reduce depressive symptoms in young adult students: A multiple-case study design. The Arts in Psychotherapy 71 (2020), 101720.
[2]
Kat R Agres, Rebecca S Schaefer, Anja Volk, Susan van Hooren, Andre Holzapfel, Simone Dalla Bella, Meinard Müller, Martina De Witte, Dorien Herremans, Rafael Ramirez Melendez, 2021. Music, computing, and health: a roadmap for the current and future roles of music technology for health care and well-being. Music & Science 4 (2021), 2059204321997709.
[3]
American Music Therapy Association. 2020. Definition and quotes about music therapy. American Music Therapy Association. https://www.musictherapy.org/about/quotes (2020).
[4]
Ben Bengler and Nick Bryan-Kinns. 2013. Designing collaborative musical experiences for broad audiences. In Proceedings of the 9th ACM Conference on Creativity & Cognition. 234–242.
[5]
Diana Boer and Amina Abubakar. 2014. Music listening in families and peer groups: benefits for young people’s social cohesion and emotional well-being across four cultures. Frontiers in psychology 5 (2014), 392.
[6]
Birgitta Cappelen and Anders-Petter Andersson. 2016. Health improving multi-sensorial and musical environments. In Proceedings of the Audio Mostly 2016. 178–185.
[7]
Angela Chang and Hiroshi Ishii. 2007. Zstretch: A Stretchy Fabric Music Controller. In Proceedings of the 7th International Conference on New Interfaces for Musical Expression (New York, New York) (NIME ’07). Association for Computing Machinery, New York, NY, USA, 46–49. https://doi.org/10.1145/1279740.1279746
[8]
Hsin Chu, Chyn-Yng Yang, Yu Lin, Keng-Liang Ou, Tso-Ying Lee, Anthony Paul O’Brien, and Kuei-Ru Chou. 2014. The impact of group music therapy on depression and cognition in elderly persons with dementia: a randomized controlled study. Biological research for Nursing 16, 2 (2014), 209–217.
[9]
Franceli L Cibrian, Oscar Peña, Deysi Ortega, and Monica Tentori. 2017. BendableSound: An elastic multisensory surface using touch-based interactions to assist children with severe autism during music therapy. International Journal of Human-Computer Studies 107 (2017), 22–37.
[10]
Alexander Hew Dale Crooke and Katrina Skewes Mcferran. 2019. Improvising using beat making technologies in music therapy with young people. Music Therapy Perspectives 37, 1 (2019), 55–64.
[11]
Ian Cross. 2014. Music and communication in music psychology. Psychology of music 42, 6 (2014), 809–819.
[12]
Ella Dagan, Elena Márquez Segura, Miguel Flores, and Katherine Isbister. 2018. 'Not Too Much, Not Too Little' Wearables For Group Discussions. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems. 1–6.
[13]
Martina de Witte, Ana da Silva Pinho, Geert-Jan Stams, Xavier Moonen, Arjan ER Bos, and Susan van Hooren. 2022. Music therapy for stress reduction: a systematic review and meta-analysis. Health Psychology Review 16, 1 (2022), 134–159.
[14]
Cochavit Elefant, Felicity A Baker, Meir Lotan, Simen Krogstie Lagesen, and Geir Olve Skeie. 2012. The effect of group music therapy on mood, speech, and singing in individuals with Parkinson’s disease—A feasibility study. Journal of music therapy 49, 3 (2012), 278–302.
[15]
Stu Favilla and Sonja Pedell. 2013. Touch screen ensemble music: collaborative interaction for older people with dementia. In Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration. 481–484.
[16]
Sidney S Fels, Linda T Kaastra, Sachiyo Takahashi, and Graeme McCaig. 2004. Evolving Tooka: from Experiment to Instrument. In NIME, Vol. 4. 1–6.
[17]
Mark Finch, Susan LeMessurier Quinn, and Ellen Waterman. 2016. Improvisation, adaptability, and collaboration: Using AUMI in community music therapy. In Voices: A World Forum for Music Therapy, Vol. 16.
[18]
Christopher Frayling. 1994. Research in art and design (Royal College of Art Research Papers, vol 1, no 1, 1993/4). (1994).
[19]
Emma Frid. 2018. Accessible digital musical instruments-a survey of inclusive instruments. In Proceedings of the International Computer Music Conference. International Computer Music Association, 53–59.
[20]
Laura Fusar-Poli, Łucja Bieleninik, Natascia Brondino, Xi-Jing Chen, and Christian Gold. 2018. The effect of music therapy on cognitive functions in patients with dementia: a systematic review and meta-analysis. Aging & Mental Health 22, 9 (2018), 1103–1112.
[21]
William Gaver. 2012. What should we expect from research through design?. In Proceedings of the SIGCHI conference on human factors in computing systems. 937–946.
[22]
Denise Grocke, Sidney Bloch, and David Castle. 2009. The effect of group music therapy on quality of life for participants living with a severe and enduring mental illness. Journal of music therapy 46, 2 (2009), 90–104.
[23]
Ge Guo, Gilly Leshed, and Keith Evan Green. 2023. “I normally wouldn’t talk with strangers”: Introducing a Socio-Spatial Interface for Fostering Togetherness Between Strangers. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–20.
[24]
Nicole D Hahna, Susan Hadley, Vern H Miller, and Michelle Bonaventura. 2012. Music technology usage in music therapy: A survey of practice. The Arts in Psychotherapy 39, 5 (2012), 456–464.
[25]
Billy Harris and Martha Summa-Chadwick. 2005. A computerized system for Neurologic Music Therapy. Journal of Computing Sciences in Colleges 21, 2 (2005), 250–257.
[26]
Marc Hassenzahl, Stephanie Heidecker, Kai Eckoldt, Sarah Diefenbach, and Uwe Hillmann. 2012. All you need is love: Current strategies of mediating intimate relationships through technology. ACM Transactions on Computer-Human Interaction (TOCHI) 19, 4 (2012), 1–19.
[27]
Mads Hobye and Jonas Löwgren. 2011. Touching a stranger: Designing for engaging experience in embodied interaction. International Journal of Design 5, 3 (2011), 31–48.
[28]
Amy Huggard, Anushka De Mel, Jayden Garner, Cagdas 'Chad' Toprak, Alan Chatham, and Florian 'Floyd' Mueller. 2013. Musical embrace: exploring social awkwardness in digital games. In Proceedings of the 2013 ACM international joint conference on Pervasive and ubiquitous computing. 725–728.
[29]
Andy Hunt, Ross Kirk, and Matt Neighbour. 2004. Multiple media interfaces for music therapy. IEEE MultiMedia 11, 3 (2004), 50–58.
[30]
Hilary Hutchinson, Wendy Mackay, Bo Westerlund, Benjamin B Bederson, Allison Druin, Catherine Plaisant, Michel Beaudouin-Lafon, Stéphane Conversy, Helen Evans, Heiko Hansen, 2003. Technology probes: inspiring design for and with families. In Proceedings of the SIGCHI conference on Human factors in computing systems. 17–24.
[31]
Javier Jaimovich. 2010. Ground Me! An Interactive Sound Art Installation. In NIME. 391–394.
[32]
Pradthana Jarusriboonchai, Aris Malapaschas, and Thomas Olsson. 2016. Design and evaluation of a multi-player mobile game for icebreaking activity. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. 4366–4377.
[33]
Patrik N Juslin, Simon Liljeström, Daniel Västfjäll, Gonçalo Barradas, and Ana Silva. 2008. An experience sampling study of emotional reactions to music: listener, music, and situation. Emotion 8, 5 (2008), 668.
[34]
Pedro Kirk, Mick Grierson, Rebeka Bodak, Nick Ward, Fran Brander, Kate Kelly, Nicholas Newman, and Lauren Stewart. 2016. Motivating stroke rehabilitation through music: A feasibility study using digital musical instruments in the home. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. 1781–1785.
[35]
Lisa Kleinman, Amy Carney, and Ashley Ma. 2014. Billboard: interacting with personal public displays. In CHI’14 Extended Abstracts on Human Factors in Computing Systems. 495–498.
[36]
Zhonghua Li, Qiaoliang Xiang, Jason Hockman, Jianqing Yang, Yu Yi, Ichiro Fujinaga, and Ye Wang. 2010. A music search engine for therapeutic gait training. In Proceedings of the 18th ACM international conference on Multimedia. 627–630.
[37]
Nan Lin, Alfred Dean, and Walter M Ensel. 2013. Social support, life events, and depression. Academic Press.
[38]
Alexandra Linnemann, Jana Strahler, and Urs M Nater. 2016. The stress-reducing effect of music listening varies depending on the social context. Psychoneuroendocrinology 72 (2016), 97–105.
[39]
Joana Lobo, Soichiro Matsuda, Izumi Futamata, Ryoichi Sakuta, and Kenji Suzuki. 2019. Chimelight: Augmenting instruments in interactive music therapy for children with neurodevelopmental disorders. In Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility. 124–135.
[40]
Martin Ludvigsen. 2005. Designing for social use in public places–A conceptual framework of social interaction. Proceedings of Designing Pleasuable Products and Interfaces, DPPI 5 (2005), 389–408.
[41]
Jihui Lyu, Jingnan Zhang, Haiyan Mu, Wenjie Li, Mei Champ, Qian Xiong, Tian Gao, Lijuan Xie, Weiye Jin, Wan Yang, 2018. The effects of music therapy on cognition, psychiatric symptoms, and activities of daily living in patients with Alzheimer’s disease. Journal of Alzheimer’s disease 64, 4 (2018), 1347–1358.
[42]
Nicola Marcon. 2018. Designing a sonic interactive open-ended playground installation. https://api.semanticscholar.org/CorpusID:56245659
[43]
Louise Montello and Edgar E Coons. 1998. Effects of active versus passive group music therapy on preadolescents with emotional, learning, and behavioral disorders. Journal of music therapy 35, 1 (1998), 49–67.
[44]
Kimberly Sena Moore. 2013. A systematic review on the neural effects of music on emotion regulation: Implications for music therapy practice. Journal of music therapy 50, 3 (2013), 198–242.
[45]
Florian Mueller, Sophie Stellmach, Saul Greenberg, Andreas Dippon, Susanne Boll, Jayden Garner, Rohit Khot, Amani Naseem, and David Altimira. 2014. Proxemics play: understanding proxemics for designing digital play experiences. In Proceedings of the 2014 conference on Designing interactive systems. 533–542.
[46]
Alexander Müller-Rakow and Jochen Fuchs. 2012. The Human Skin as an Interface for Musical Expression. In NIME.
[47]
Thomas Olsson, Pradthana Jarusriboonchai, Paweł Woźniak, Susanna Paasovaara, Kaisa Väänänen, and Andrés Lucero. 2020. Technologies for enhancing collocated social interaction: review of design solutions and approaches. Computer Supported Cooperative Work (CSCW) 29 (2020), 29–83.
[48]
Daniel Orth, Clementine Thurgood, and Elise Van Den Hoven. 2020. Embodying meaningful digital media: A strategy to design for product attachment in the digital age. In Proceedings of the Fourteenth International Conference on Tangible, Embedded, and Embodied Interaction. 81–94.
[49]
Claudio Pacchetti, Francesca Mancini, Roberto Aglieri, Cira Fundarò, Emilia Martignoni, and Giuseppe Nappi. 2000. Active music therapy in Parkinson’s disease: an integrative method for motor and emotional rehabilitation. Psychosomatic medicine 62, 3 (2000), 386–393.
[50]
Elena Partesotti, Alicia Peñalba, and Jônatas Manzolli. 2018. Digital instruments and their uses in music therapy. Nordic Journal of Music Therapy 27, 5 (2018), 399–418.
[51]
Michelle Pate, Meera Rastogi, and Vittoria Daiello. 2022. Community-based art therapy and community arts. In Foundations of Art Therapy. Elsevier, 493–541.
[52]
Alicia Peñalba, María José Valles, Elena Partesotti, Rosario Castañón, M Sevillano, and R Wechsler. 2015. Types of interaction in the use of MotionComposer, a device that turns movement into sound. Proceedings of ICMEM (2015), 1–8.
[53]
Jordan S Potash, Sarah M Mann, Johanna C Martinez, Ann B Roach, and Nina M Wallace. 2016. Spectrum of art therapy practice: Systematic literature review of art therapy, 1983–2014. Art Therapy 33, 3 (2016), 119–127.
[54]
Philippa Riley, Norman Alm, and Alan Newell. 2009. An interactive tool to promote musical creativity in people with dementia. Computers in Human Behavior 25, 3 (2009), 599–608.
[55]
Yvonne Rogers and Harry Brignull. 2002. Subtle ice-breaking: encouraging socializing and interaction around a large public display. In Workshop on Public, Community. and Situated Displays, Vol. 6.
[56]
Beat Rossmy, Sonja Rümelin, and Alexander Wiethoff. 2021. StringTouch-From String Instruments towards New Interface Morphologies. In Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction. 1–10.
[57]
Bert Schiettecatte and Jean Vanderdonckt. 2008. AudioCubes: a distributed cube tangible interface based on interaction range for sound design. In Proceedings of the 2nd international conference on Tangible and embedded interaction. 3–10.
[58]
Sarah Schoemann and Michael Nitsche. 2017. Needle as Input: Exploring Practice and Materiality When Crafting Becomes Computing. (2017).
[59]
Elena Márquez Segura and Katherine Isbister. 2015. Enabling co-located physical social play: A framework for design and evaluation. Game user experience evaluation (2015), 209–238.
[60]
Katrina Skewes. 2002. A review of current practice in group music therapy improvisations. British Journal of Music Therapy 16, 1 (2002), 46–55.
[61]
Karette Stensæth. 2018. Music therapy and interactive musical media in the future: Reflections on the subject-object interaction. Nordic Journal of Music Therapy 27, 4 (2018), 312–327.
[62]
Ekaterina R Stepanova, John Desnoyers-Stewart, Kristina Höök, and Bernhard E Riecke. 2022. Strategies for Fostering a Genuine Feeling of Connection in Technologically Mediated Systems. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 1–26.
[63]
Tim Swingler. 1998. The invisible keyboard in the air: An overview of the educational, therapeutic and creative applications of the EMS Soundbeam. In 2nd European Conference for Disability, Virtual Reality & Associated Technology.
[64]
Savneet Talwar. 2016. Is there a need to redefine art therapy?, 116–118 pages.
[65]
Bronwyn Tarr, Jacques Launay, and Robin IM Dunbar. 2014. Music and social bonding:“self-other” merging and neurohormonal mechanisms. Frontiers in psychology 5 (2014), 1096.
[66]
Paul Tennent, Kristina Höök, Steve Benford, Vasiliki Tsaknaki, Anna Ståhl, Claudia Dauden Roquet, Charles Windlin, Pedro Sanches, Joe Marshall, Christine Li, 2021. Articulating soma experiences using trajectories. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–16.
[67]
Linda De Valk, Tilde Bekker, and Berry Eggen. 2015. Designing for social interaction in open-ended play environments. International Journal of Design 9 (2015), 107–120. https://api.semanticscholar.org/CorpusID:5017820
[68]
Michael Viega, Victoria Druziako, Josh Millrod, and Al Hoberman. 2023. Entering the Ambient: A Performative Collaborative Autoethnography of Music Therapists’ Improvising with Digital Music Technologies. In Voices: A World Forum for Music Therapy, Vol. 23.
[69]
Gil Weinberg, Roberto Aimi, and Kevin Jennings. 2002. The beatbug network: a rhythmic system for interdependent group collaboration. In Proceedings of the 2002 conference on New interfaces for musical expression. 1–6.
[70]
Gil Weinberg and Seum-Lim Gan. 2001. The squeezables: Toward an expressive and interdependent multi-player musical instrument. Computer Music Journal 25, 2 (2001), 37–45.
[71]
Claire M Weller and Felicity A Baker. 2011. The role of music therapy in physical rehabilitation: a systematic literature review. Nordic Journal of Music Therapy 20, 1 (2011), 43–61.
[72]
Barbara L Wheeler. 1987. Levels of therapy: The classification of music therapy goals. Music Therapy 6, 2 (1987), 39–49.
[73]
Anna Xambó, Eva Hornecker, Paul Marshall, Sergi Jordà, Chris Dobbyn, and Robin Laney. 2017. Exploring social interaction with a tangible music interface. Interacting with Computers 29, 2 (2017), 248–270.
[74]
Baixi Xing, Kejun Zhang, Lekai Zhang, Eng Keong Lua, and Shouqian Sun. 2013. Human-centric music medical therapy exploration system. In Proceedings of the 2013 ACM SIGCOMM workshop on Future human-centric multimedia networking. 3–8.
[75]
Ming Zhang, Yi Ding, Jing Zhang, Xuefeng Jiang, Nannan Xu, Lei Zhang, and Wenjie Yu. 2022. Effect of group impromptu music therapy on emotional regulation and depressive symptoms of college students: a randomized controlled study. Frontiers in Psychology 13 (2022), 851526.
[76]
John Zimmerman, Jodi Forlizzi, and Shelley Evenson. 2007. Research through design as a method for interaction design research in HCI. In Proceedings of the SIGCHI conference on Human factors in computing systems. 493–502.
