1 Introduction

Amateur theatre is a highly popular creative activity. Acting, lighting, sound, and various other elements mutually affect one another in a theatrical performance. Actors and staff therefore need to move forward with preparations while communicating with each other in order to give a successful performance. However, that is not easy to do. Amateur theatre presents three main difficulties.

  • Difficulty 1. It is difficult for an actor to grasp beforehand how the acting will unfold during the performance.

  • Difficulty 2. It is difficult for staff to picture the performance plan in their head.

  • Difficulty 3. It is difficult for performers and staff to share their image of the performance with other people.

Prior research has supported creative activities. For example, Kato et al. developed a picture-based system for supporting story production [1]. The user specifies the facial orientation and expressions of characters with pictures; the system then infers the relationships between the characters, extrapolates each character's behavior using a database, and makes proposals to the user. Nevertheless, only a few studies have aimed to support amateur theatre. Lewis developed the Bown Virtual Theatre system, in which actors and stage art are arranged in an imaginary space with three-dimensional graphics and the stage can be checked virtually by entering lighting information [2]. Slater et al. developed a system in which actors can rehearse in a virtual-reality space [3]. In addition, Horiuchi et al. developed a system in which the stage conditions and performance plan can be shared using a tabletop interface [4].

Several studies have thus focused on amateur theatre, but none aims to resolve all three of the aforementioned difficulties. Lewis's system [2] is premised on use by a single person, so it makes no provision for sharing performance images with other people. Slater et al.'s system [3] is effective in that the director can give acting instructions, but the user cannot enter or check the lighting and sound performance. Horiuchi et al.'s system [4] assumes that people gather around a large tabletop interface, which places all relevant parties in the same space; information cannot be shared unless every member is present.

In this paper, we propose an Android application that addresses these three difficulties. The application has the following three functions:

  1. The stage and stage setting are displayed with 3D graphics, and actors are represented as 3D objects in this space. The user can therefore be expected to rehearse while imagining the actual venue, even in an environment different from it.

  2. The lighting and sound performance information is reproduced with 3D graphics and audio. This helps users think about the performance, makes the performance plan easier to understand, and allows it to be conveyed to other staff.

  3. The information is shared across multiple devices through a network, enabling smooth transmission of performance images among staff. The user can also be expected to share information more easily with people who are not present at meetings and rehearsals.

2 Application Overview

In this application, the left half of the screen is the stage area and the right half is the performance editing area (Fig. 1). The stage area shows 3D models. Menu buttons are placed at the right end of the performance editing area; when the user taps one, a performance editing panel corresponding to the function assigned to that button is displayed in the lower part of the performance editing area, with the script displayed in the upper part. When editing the performance information, the user checks the view of the stage in the left-side stage area while entering the lighting and sound performance information in the right-side performance editing area.

2.1 Stage Area

The stage, reproduced with 3D graphics, is displayed on the left side of the screen; this is the stage area. When performance information is entered with each of the functions explained below, the results are rendered in this area. By dragging in this area, the user can freely control the virtual camera that shows the stage and check different points of view, from the spectator seats to an actor's viewpoint. In addition, a pinch gesture zooms in and out, and a long tap switches the stage area to a full-screen display.
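The drag and pinch behavior described above can be sketched as an orbit camera around the stage center. This is a minimal illustration; the class and parameter names (StageCamera, DEG_PER_PIXEL) and the specific sensitivity and clamp values are assumptions, not the application's actual code.

```java
// Sketch of the stage-area camera: dragging orbits the view, pinching zooms.
// All names and constants here are illustrative assumptions.
public class StageCamera {
    double yawDeg = 0.0;    // horizontal angle around the stage centre
    double pitchDeg = 20.0; // vertical angle, clamped so the camera stays above the floor
    double distance = 10.0; // distance from the stage centre (changed by pinch)

    static final double DEG_PER_PIXEL = 0.25; // assumed drag sensitivity

    // Called with the finger movement since the last touch event.
    public void onDrag(double dxPixels, double dyPixels) {
        yawDeg = (yawDeg + dxPixels * DEG_PER_PIXEL) % 360.0;
        pitchDeg = Math.max(0.0, Math.min(89.0, pitchDeg + dyPixels * DEG_PER_PIXEL));
    }

    // Pinch gesture scales the viewing distance (zoom in/out) within limits.
    public void onPinch(double scale) {
        distance = Math.max(2.0, Math.min(30.0, distance / scale));
    }

    // Camera position in stage coordinates, with the stage centre at the origin.
    public double[] position() {
        double yaw = Math.toRadians(yawDeg), pitch = Math.toRadians(pitchDeg);
        return new double[] {
            distance * Math.cos(pitch) * Math.sin(yaw),
            distance * Math.sin(pitch),
            distance * Math.cos(pitch) * Math.cos(yaw)
        };
    }
}
```

Clamping the pitch keeps the camera from dipping below the stage floor, while leaving the yaw unbounded lets the user circle the stage freely to approximate any seat or actor viewpoint.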

2.2 Performance Editing Area — Lighting

Two lighting functions are provided, lighting function 1 and lighting function 2, which are used according to the objective. The performance editing panel for lighting function 1 is displayed by tapping the “Light 1” button in the menu (Fig. 1, top-left). This panel provides the same number of slide bars as there are light stands in the theater, and the intensity of each light stand's beam changes according to the corresponding bar. White, red, green, and blue buttons at the left end of the panel let the user change the light color. Because every change is reflected in the stage area, this function is used to check how the stage looks under different lighting. The performance editing panel for lighting function 2 is displayed by tapping the “Light 2” button in the menu (Fig. 1, top-right). With lighting function 2, the user selects the light stand to edit from a drop-down list and sets the light intensity by dragging the displayed graph. The horizontal axis of this graph corresponds to the script and the vertical axis indicates the light intensity.
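One plausible way to realize the lighting function 2 graph is to store the dragged curve as (script position, intensity) keyframes and interpolate linearly between them when rendering the stage. The sketch below assumes this representation; the class name LightCurve and the intensity range [0, 1] are illustrative assumptions.

```java
import java.util.Map;
import java.util.TreeMap;

// Sketch of one light stand's intensity curve in lighting function 2:
// the horizontal axis is the script position, the vertical axis is intensity.
public class LightCurve {
    // script position (e.g. line number) -> intensity in [0, 1]
    private final TreeMap<Double, Double> keys = new TreeMap<>();

    // Called as the user drags the graph, recording one point of the curve.
    public void setKey(double scriptPos, double intensity) {
        keys.put(scriptPos, Math.max(0.0, Math.min(1.0, intensity)));
    }

    // Intensity at an arbitrary point of the script, linearly interpolated
    // between the surrounding keyframes; flat beyond the first/last key.
    public double intensityAt(double scriptPos) {
        if (keys.isEmpty()) return 0.0;
        Map.Entry<Double, Double> lo = keys.floorEntry(scriptPos);
        Map.Entry<Double, Double> hi = keys.ceilingEntry(scriptPos);
        if (lo == null) return hi.getValue();
        if (hi == null) return lo.getValue();
        if (lo.getKey().equals(hi.getKey())) return lo.getValue();
        double t = (scriptPos - lo.getKey()) / (hi.getKey() - lo.getKey());
        return lo.getValue() + t * (hi.getValue() - lo.getValue());
    }
}
```

The same lookup can drive the stage-area rendering for lighting function 1's live preview, since both functions ultimately map a script position to a per-stand intensity.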

2.3 Performance Editing Area — Sound

The sound performance editing panel appears when the user taps the “Sound” button in the menu (Fig. 1, bottom-left). The user selects a song or sound effect from a drop-down list and sets the volume by dragging the displayed graph. Play, pause, and stop buttons are provided. When the user taps the play button, a bar indicating the current playback position flows in from the right side, and all the songs edited and set for the performance play accordingly.
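The playback side of this panel can be pictured as a timeline of cues plus a moving position bar. The sketch below shows that idea under assumed names (SoundTimeline, Cue); the paper does not specify how cues are stored internally.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the sound panel's playback model: edited songs/effects become
// cues on a timeline, and the position bar advances while playing.
public class SoundTimeline {
    static class Cue {
        final String name;
        final double start, end; // interval on the timeline, in seconds
        Cue(String name, double start, double end) {
            this.name = name; this.start = start; this.end = end;
        }
    }

    private final List<Cue> cues = new ArrayList<>();
    private double position = 0.0;  // current position of the bar, in seconds
    private boolean playing = false;

    public void addCue(String name, double start, double end) { cues.add(new Cue(name, start, end)); }
    public void play()  { playing = true; }                       // "play" button
    public void pause() { playing = false; }                      // "pause" button
    public void stop()  { playing = false; position = 0.0; }      // "stop" button
    public void tick(double dt) { if (playing) position += dt; }  // advance the bar
    public double position() { return position; }

    // Cues whose edited interval covers the current bar position,
    // i.e. the sounds that should be audible right now.
    public List<String> activeCues() {
        List<String> active = new ArrayList<>();
        for (Cue c : cues)
            if (position >= c.start && position < c.end) active.add(c.name);
        return active;
    }
}
```

Pairing each active cue with a volume curve like the lighting graph would give the full behavior described above: as the bar flows, each set song plays at its edited volume.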

2.4 Performance Editing Area — Stage Set, Actors

The stage set performance editing panel is displayed by tapping the “Stage set” button in the menu (Fig. 1, bottom-right). The user selects a type of stage set and pushes the position button, and the stage set is generated at the center of the stage area. Currently there are only three types of cubic stage set. The user can also place an actor, represented as a human figure, by selecting the actor button. Tapping a stage set or actor placed in the stage area changes its color, and its position can be freely changed by dragging.
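The tap-to-recolor and drag-to-move interaction can be sketched as a small object model. The class name, the particular color cycle, and the floor-plane movement are assumptions for illustration; the paper only states that tapping changes the color and dragging changes the position.

```java
// Sketch of a stage set or actor object placed in the stage area.
// The color list and names are illustrative assumptions.
public class StageObject {
    static final String[] COLORS = { "white", "red", "green", "blue" };
    private int colorIndex = 0;
    double x, y, z; // position in the virtual stage space

    public StageObject(double x, double y, double z) {
        this.x = x; this.y = y; this.z = z;
    }

    public String color() { return COLORS[colorIndex]; }

    // Tapping the object cycles its color.
    public void onTap() { colorIndex = (colorIndex + 1) % COLORS.length; }

    // Dragging moves the object across the stage floor (x-z plane assumed).
    public void onDrag(double dx, double dz) { x += dx; z += dz; }
}
```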

2.5 Information Sharing Between Devices (Data Transmission)

The performance information entered on each device is shared in real time with the other devices through a server over the Internet. When a user creates a stage set or actor object, the information for that object is shared with the other devices. The information sent and received between the devices consists of the position of each object within the virtual space and its size. When all the devices have disconnected from the server, the object information stored on the server is discarded.
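Since each update carries only an object's position and size, the per-object message can be very small. The sketch below shows one possible encoding; the paper does not specify a wire format, so the comma-separated layout, the field names, and the class name ObjectUpdate are all assumptions.

```java
// Sketch of the per-object update message relayed through the server.
// The wire format (a comma-separated line) is an assumption for illustration.
public class ObjectUpdate {
    final String objectId;  // identifies the stage set or actor object
    final double x, y, z;   // position in the shared virtual space
    final double size;      // object scale

    public ObjectUpdate(String objectId, double x, double y, double z, double size) {
        this.objectId = objectId;
        this.x = x; this.y = y; this.z = z;
        this.size = size;
    }

    // Encode for sending to the server.
    public String encode() {
        return String.join(",", objectId,
                Double.toString(x), Double.toString(y), Double.toString(z),
                Double.toString(size));
    }

    // Decode a message received from the server.
    public static ObjectUpdate decode(String line) {
        String[] f = line.split(",");
        return new ObjectUpdate(f[0],
                Double.parseDouble(f[1]), Double.parseDouble(f[2]),
                Double.parseDouble(f[3]), Double.parseDouble(f[4]));
    }
}
```

Because the server only relays and caches these updates, discarding its stored state once every device disconnects (as described above) is straightforward: no authoritative copy needs to survive the session.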

Fig. 1.

Screenshots of our application. The screen consists of the left-side stage area and the right-side performance editing area. The panel in the performance editing area can be switched between Light 1 (top-left), Light 2 (top-right), Sound (bottom-left), and Stage set (bottom-right). (Color figure online)

3 Experiment

3.1 Experimental Conditions

Evaluation experiments were carried out to verify the effectiveness of the application in an actual public performance. We asked five participants (Table 1), members of a drama circle at their university, to briefly address the audience on an actual stage. Ten days before the performance, we lent each participant an Android tablet with our application installed. The participants held four one-hour meetings (10 days, 7 days, 4 days, and 1 day before the performance). We asked them to use the application as much as possible, both during the meetings and at other times. After the performance, we had them rate the following functions on a five-point scale from 1 to 5.

  • Subject functions:

    • Lighting

    • Sound

    • Stage set

    • Actors

  • Question:

    • Q1. Did it become easier to picture your performance with the function?

    • Q2. Did the function help you in your performance plan?

    • Q3. Was the function easy to use?

Table 1. Overview of participants
Table 2. Experimental results

3.2 Experimental Results

The experimental results are listed in Table 2. The actors function received low ratings (average of 2.8) for both Q1 and Q2, suggesting that the solution to Difficulty 1 was inadequate. The reasons given were that the movement of actors could not be recorded and that their specific movements could not be fixed. The other functions, especially lighting and sound, received high ratings for Q1 and Q2 (although the stage set scored lower). In the open-ended answers, participants found the application useful, saying that they could easily check the lighting and sound. This implies that Difficulty 2 has been partly resolved. Q3 received low ratings overall: we found no problem with the application concept and function design itself, but the usability of the implementation needs to be reviewed. Participants also commented that the simulation of lighting and sound was useful for sharing ideas during the meetings, which implies that Difficulty 3 has also been partly resolved. However, the network information-sharing function sometimes failed to work, and its implementation must be improved.

4 Conclusions

We proposed an Android application with which the user can enter, check, and share lighting, sound, and stage performance information to support amateur theatre. The experiment confirmed that the lighting and sound functions were effective, but revealed a particular problem with the usability of the function for entering actors' movements. In the future, we would like to improve the usability and conduct a larger-scale experiment.