Abstract
In this paper we present smARt.assembly, a projection-based augmented reality (AR) assembly assistance system for industrial applications. Our system projects digital guidance information, namely picking information and assembly instructions, into the physical workspace of the user. By using projections, we eliminate the need for smart glasses, which have drawbacks such as a limited field of view and low wearing comfort. With smARt.assembly, users are able to assemble products without previous knowledge and without any other assistance.
1 Introduction
Current production systems are characterized by a very high degree of automation. However, full automation of all production processes is not feasible. Changing markets cause product life cycles to become shorter and shorter. In addition, changing customer requirements cause a shift from mass production to mass customization, where products have to be produced with a lot size of one [10]. These changes require a flexible production process in which manual work will still be necessary. Workers at manual assembly stations (shown in Fig. 1) usually perform the manual tasks hand in hand with machines. Consequently, there will still be a place for humans during assembly in the production process of the future.
Embedding a manual assembly station into the production process results in high requirements for the worker. Due to mass customization, even two consecutive products can be completely different. This can result in highly demanding tasks for the worker, causing lower productivity and higher error rates. Due to the low availability of trained workers and concepts such as job rotation, training for the product assembly is required. However, training costs time and money, withdraws the worker from production, and is less applicable in the case of ever-changing products. An assistance system might also be used to support the worker at a manual assembly station, for example to train temporary workers in phases of high demand, in mass customization scenarios where exhaustive training of workers is difficult, for elderly people (whose number is increasing due to demographic change [12]), or for impaired workers who might need assistance during assembly due to various kinds of cognitive or motor skill impairments [13].
In previous research and industrial work, Augmented Reality (AR) assistance systems with smart glasses have been proposed as a way of training and assisting users, e.g. [5, 9, 17, 18, 20, 21]. However, systems with smart glasses often have drawbacks. First, the glasses currently available have limited wearing comfort due to the high weight of the electronic equipment, which makes them unsuitable for a worker's eight-hour shift. Second, people who require optical aids would either need a special version of the smart glasses (resulting in very high expenses) or need to rely on contact lenses. Wearing smart glasses in combination with prescription glasses is usually not an option. Third, the field of view is often reduced when wearing smart glasses, as described in [9]. To overcome these issues, we created smARt.assembly, an AR system that guides users through the steps of a manual assembly process by projecting digital artifacts into the physical space. This system has been developed as part of the SmartFactoryOWL, a factory for research purposes that aims at evaluating concepts and ideas for future manufacturing facilities [7]. Having presented first ideas in [4], we now show how smARt.assembly could support users with manual assembly tasks in the future, without requiring smart glasses and with minimal installation effort on existing assembly stations.
2 Related Work
A large body of research exists on the broad field of AR (e.g. [2, 3, 8, 17, 20]). In this paper, we present related work from the two fields our work is based on: projection-based (spatial) AR and AR assistance systems in industrial manufacturing.
2.1 Projection-Based Augmented Reality
Projection-based AR allows projecting digital artifacts onto existing surfaces. The device itself can be much smaller than the projection surface, which is an advantage in comparison with other approaches like conventional displays; it even allows spanning whole rooms [11]. The broad availability and low cost of projectors today make them an interesting choice for AR systems. Besides the early work of Cruz-Neira et al. on projections for virtual reality [6], one of the first uses of projection-based AR was the Luminous Room by Underkoffler et al. [22]. They built a room where both projection and image capturing were possible on all surfaces. Based on this room, they developed multiple use case examples for projection-based AR. The general issue with projection-based AR is the projection onto heterogeneous surfaces. To address this issue, cameras can be used in combination with the projector to implement a feedback loop, as demonstrated in the work of Jones et al. [11]. The information gathered by a depth camera is used to prepare the image before projection in such a way that it is displayed correctly after projection. An alternative solution is to use multiple projectors that all have different focus planes [14]. However, this approach is limited in the case of complex geometry requiring many different focus planes. In industrial applications, Otto et al. [16] have used projections to augment floor surfaces to visualize layouts of assembly stations. While being inspired by the mentioned work, we focus on the case of giving assistance to untrained workers using projection-based AR.
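To make the feedback-loop idea concrete, the following minimal sketch pre-warps an instruction image with the inverse of a projector-to-surface homography, so that it appears undistorted after projection onto a planar but tilted surface. This illustrates only the general principle (it is not the implementation of Jones et al. [11]); it uses OpenCV, and the corner coordinates are hypothetical values that a camera-based calibration would provide.

```cpp
// Sketch: pre-warp an image so it appears undistorted after projection.
// The four point pairs below stand in for real calibration measurements.
#include <opencv2/opencv.hpp>

int main() {
    cv::Mat guidance = cv::imread("instruction.png");  // image to project
    if (guidance.empty()) return 1;

    // Image corners in projector pixel coordinates (1280x800 output) ...
    std::vector<cv::Point2f> src = {{0, 0}, {1279, 0}, {1279, 799}, {0, 799}};
    // ... and where those corners land on the surface, as observed by a
    // calibration camera (hypothetical measured values).
    std::vector<cv::Point2f> dst = {{40, 25}, {1250, 10}, {1270, 780}, {20, 790}};

    // H maps projector pixels to surface coordinates; warping the desired
    // surface image with H^-1 pre-distorts it to cancel the distortion.
    cv::Mat H = cv::getPerspectiveTransform(src, dst);
    cv::Mat Hinv = H.inv();
    cv::Mat prewarped;
    cv::warpPerspective(guidance, prewarped, Hinv, guidance.size());

    cv::imshow("projector output", prewarped);
    cv::waitKey(0);
    return 0;
}
```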
2.2 Augmented Reality Assistance Systems
In current industrial practice, assembly instructions are often provided on paper or by a computer-assisted instruction (CAI) system.
Assembly Support. Among others, Tang et al. [21] have shown that an AR manual on smart glasses is superior to paper manuals or CAI systems, both in terms of required mental effort and in terms of quality, i.e., a reduced error rate. Billinghurst et al. [2] present a system for assembly support with the help of AR presentations on mobile devices. Similarly, Gorecky et al. [8] developed an AR assembly assistance that uses tablets to show the instructions. While the mentioned works focus on assembly, in industrial applications the picking of the relevant parts is also part of the manufacturing process. Assistance systems for the picking of parts can be grouped into two concepts: Pick-by-Light and Pick-by-Vision systems.
For Pick-by-Light systems, a light is attached to every slot in a rack. The light indicates which part to pick next, in some systems combined with a number that displays the amount of parts to be picked. Proximity sensors attached to each slot can trigger the next light [9]. Pick-by-Light systems require a physical installation on the manual assembly station and therefore make the system inflexible and static. While these systems provide good spatial feedback to the user, they are expensive and only economical in small installations with a high turnover [1].
Pick-by-Vision systems usually use smart glasses to display a picking highlight in the user's field of view. A built-in camera can be used for optical tracking to determine the user's point of view. One advantage is that no installation on the rack is required and a broader area can be covered with one device [9]. From the domain of warehouse order picking there has been work on AR support for picking; e.g., Weaver et al. [23] showed that picking with smart glasses is much faster than picking based on paper instructions or audio signals.
Combined Picking and Assembly Support. While previous work on AR assembly usually omitted the picking step, Pick-by-Vision systems with assembly support have been presented. For example, Paelke [17] presents an AR system that not only focuses on the assembly but also includes the picking step in the AR application based on smart glasses. The drawbacks of this specific smart glasses approach are described in [18].
While all of the listed assistance systems use smart glasses, we focus on projection-based AR for combined picking and assembly assistance. Previous work has shown that AR applications with smart glasses are very useful in the industrial context of manual assembly; however, we believe that presenting the same information as a projection is superior, since projection-based AR eliminates some of the drawbacks of smart glasses, such as a lack of comfort and a limited field of view.
3 smARt.assembly
With smARt.assembly we present an AR-supported manual assembly station that uses projection-based AR to project virtual artifacts into the physical space. Two types of information are projected into the user's workspace: picking information, i.e., a highlight on the box where the next part is located, and assembly information for each of the assembly steps.
The highlighting on the box is realized by multiple nested white rectangles that are animated and move towards the center of the related box label. The presentation was created in multiple prototype iterations and was chosen for two reasons: First, it was the most visible presentation on the multicolored box labels; second, the evaluation of earlier prototypes showed that the animation successfully catches the user's attention, even if the box is not in their focus. The highlighting can be seen in Fig. 2(a).
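As an illustration, the sketch below draws one frame of such a highlight with OpenCV: nested white rectangles that converge on the center of a box label as an animation phase advances. The ring count, growth factor, and label rectangle are our own illustrative choices, not values from the actual system.

```cpp
#include <opencv2/opencv.hpp>
#include <cmath>

// Draw one animation frame of the pick highlight: nested white rectangles
// converging on the center of the box label. 'phase' runs from 0 to 1 and
// wraps, so calling this repeatedly with an advancing phase loops the
// animation (e.g. phase = elapsed_ms / 1000.0 for a one-second cycle).
void drawPickHighlight(cv::Mat& frame, const cv::Rect& label, double phase) {
    const int rings = 4;
    const cv::Point center(label.x + label.width / 2, label.y + label.height / 2);
    for (int i = 0; i < rings; ++i) {
        double t = std::fmod(phase + static_cast<double>(i) / rings, 1.0);
        double grow = 0.8 * (1.0 - t);  // outermost ring starts 80% larger
        int w = static_cast<int>(label.width  * (1.0 + grow)) / 2;
        int h = static_cast<int>(label.height * (1.0 + grow)) / 2;
        cv::rectangle(frame, cv::Point(center.x - w, center.y - h),
                      cv::Point(center.x + w, center.y + h),
                      cv::Scalar(255, 255, 255), 2);
    }
}

int main() {
    cv::Mat frame(800, 1280, CV_8UC3, cv::Scalar(0, 0, 0));      // projector canvas
    drawPickHighlight(frame, cv::Rect(120, 80, 180, 40), 0.25);  // hypothetical label
    cv::imshow("highlight", frame);
    cv::waitKey(0);
    return 0;
}
```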
The assembly steps are shown on a white panel on the right side of the assembly station. On this projection surface of 30 cm\(\,\times \,\)20 cm, each assembly step is shown individually. The presentation for one assembly step is a 2D image rendered from a digital 3D model. It shows the current work piece including the next part to mount, with the next part highlighted. Each of the 2D images is oriented so that the new part to mount is clearly visible. This way of presenting the information was also developed over multiple iterations: our evaluations of the first iterations indicated that a projection onto a surface located at the side of the station is superior to a presentation in the center, since the assembly instructions then do not overlap with the actual working space. The presentation can be seen in Fig. 2(b).
The projector for the installation is mounted above the user's head (Fig. 3) and very close to the assembly station, so that information can be projected right onto the workstation without disturbing the user and without the user casting shadows on the work area.
When using our current prototype, users have to indicate the next step manually to the system. For this purpose, we use foot pedals with two buttons: one for the next step and one for the previous step. With this approach we enable users to switch forward and backward without having to move their hands away from the current work piece. We are currently investigating other input methods, like gesture control or computer vision, to improve the interaction with the system from an ergonomic perspective by eliminating the foot pedals.
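A minimal sketch of this step navigation is given below. Many USB foot pedals register as HID keyboards, so the pedal is modeled here simply as two abstract events; the class structure and step names are hypothetical, not taken from the actual implementation.

```cpp
#include <iostream>
#include <string>
#include <vector>

enum class PedalEvent { Next, Previous };

class StepNavigator {
public:
    explicit StepNavigator(std::vector<std::string> steps)
        : steps_(std::move(steps)) {}

    // Advance or rewind, clamped to the first and last step.
    void onPedal(PedalEvent e) {
        if (e == PedalEvent::Next && current_ + 1 < steps_.size()) ++current_;
        if (e == PedalEvent::Previous && current_ > 0) --current_;
        std::cout << "Showing step " << current_ + 1 << ": "
                  << steps_[current_] << "\n";
        // Here the system would update both projections: the picking
        // highlight for the step's part and the assembly instruction image.
    }

private:
    std::vector<std::string> steps_;
    std::size_t current_ = 0;
};

int main() {
    StepNavigator nav({"mount base plate", "insert gear", "attach cover"});
    nav.onPedal(PedalEvent::Next);      // forward to step 2
    nav.onPedal(PedalEvent::Previous);  // back to step 1
    return 0;
}
```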
To cope with changing products or changing production environments, the system has to be flexible. A configuration mode allows changing the spatial location of the assembly instructions and adapting the system to different box layouts. Therefore, the system can easily be attached to existing racks or adapted to changes in the setup.
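The following sketch shows what such a configuration might hold, assuming each box slot maps a part identifier to a label rectangle in projector coordinates and the instruction panel gets its own region; all names and values are illustrative, not taken from the actual system.

```cpp
#include <iostream>
#include <map>
#include <string>
#include <opencv2/core.hpp>

// Per-station configuration: where each box label and the instruction
// panel lie in projector pixel coordinates.
struct StationConfig {
    std::map<std::string, cv::Rect> boxLabels;  // part id -> label area (px)
    cv::Rect instructionPanel;                  // side panel area (px)
};

int main() {
    StationConfig cfg;
    cfg.boxLabels["gear_small"] = cv::Rect(120, 80, 180, 40);    // hypothetical
    cfg.boxLabels["housing"]    = cv::Rect(320, 80, 180, 40);    // hypothetical
    cfg.instructionPanel        = cv::Rect(900, 300, 300, 200);  // hypothetical
    std::cout << "configured " << cfg.boxLabels.size() << " box labels\n";
    return 0;
}
```

Under this assumption, moving the system to a new rack only requires re-entering these rectangles in the configuration mode.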
4 Implementation
For the manual assembly station presented here, we had to cover a projection area of approximately 100 cm\(\,\times \,\)60 cm, where the boxes are arranged in three rows with a size of either 12 cm\(\,\times \,\)18 cm or 6 cm\(\,\times \,\)18 cm. The station has a flat projection surface (see Fig. 3). However, a real-world assembly station comes with a more complex arrangement to fit the user's ergonomic needs; in the future we also want to support such configurations. Since the projector must not obstruct the user's view of the assembly station, nor should the user cast shadows from the projector onto the assembly station, the projector had to be mounted very close to the rack. We used the Optoma GT670, a projector with a very small throw ratio of 0.52, a resolution of 1280\(\,\times \,\)800 (WXGA), and keystone correction up to \(40^\circ \), which fitted our purpose very well. The main issues of projectors in an assembly assistance scenario are the brightness of the projected image and the lifetime of the devices. For our prototype we chose a traditional lamp-based projector; however, the lifetime of the lamp is limited, resulting in additional maintenance time and costs. Other technologies, such as LED or laser projectors, are better suited for long-running scenarios and come with warranties for 24/7 operation. The projector is attached to a PC running Microsoft Windows 7, and the software was developed in C++ using the Metaio SDK [15].
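Assuming the standard definition of throw ratio as projection distance divided by image width, the required mounting distance for the 100 cm wide area can be checked with a one-line computation:

```cpp
#include <iostream>

int main() {
    const double throwRatio = 0.52;   // Optoma GT670 spec, as stated above
    const double imageWidth = 1.00;   // metres: required projection width
    std::cout << "Mounting distance: " << throwRatio * imageWidth << " m\n";
    return 0;
}
```

That is, the projector can sit roughly half a metre in front of the projection surface, close enough to stay above the rack and out of the user's way.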
5 Evaluation
In our ongoing study we focus on the usability of the presented assistance system for manual assembly and compare it with a previous prototype of an AR-based assistance system with smart glasses, which has been presented in [19]. Within our study, participants are asked to assemble two out of three LEGO sets, each of which consists of 18–19 bricks. The participants have to build each of the sets without prior knowledge or training. Each participant builds one set with smARt.assembly and one set with smart glasses (within-subject design). Both the build order and the assigned LEGO sets are selected randomly, making sure that no participant builds two identical sets. The time for the build process is measured, and errors in picking and in the build process are observed and logged. After the assembly of the two sets, the participants fill out a survey in which they rate statements about the usefulness and joyfulness of the two approaches on a Likert scale. The participants were recruited at our university, resulting in eight male participants, aged 21–30 years (average 24.25 years), all of whom are students.
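One plausible way to implement this randomized, counterbalanced assignment is sketched below; the exact assignment procedure is an assumption, since the paper only states that order and sets are chosen randomly with no repeated set per participant.

```cpp
#include <algorithm>
#include <array>
#include <iostream>
#include <random>
#include <string>

int main() {
    std::mt19937 rng(std::random_device{}());
    std::array<std::string, 2> conditions = {"smARt.assembly", "smart glasses"};
    std::array<int, 3> sets = {0, 1, 2};  // the three LEGO sets

    for (int participant = 1; participant <= 8; ++participant) {
        std::shuffle(conditions.begin(), conditions.end(), rng);
        // Shuffling three distinct sets and taking the first two guarantees
        // that no participant builds the same set twice.
        std::shuffle(sets.begin(), sets.end(), rng);
        std::cout << "P" << participant << ": " << conditions[0]
                  << " with set " << sets[0] << ", then " << conditions[1]
                  << " with set " << sets[1] << "\n";
    }
    return 0;
}
```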
While the study is still ongoing, the results of the first eight participants are promising. They clearly show that the assembly time per step is lower for smARt.assembly than for the previous system based on smart glasses: participants took between 5.4 s and 14.4 s per step (average: 9.3 s) with smARt.assembly, less than half the time of the previous prototype, which resulted in a range between 13.9 s and 32.9 s (average: 24.4 s).
The error rate was influenced by the technology in the same way: while users had an average picking error rate of 10.1 % with smart glasses, not a single participant made a picking error with smARt.assembly. Regarding the assembly process itself, all builds were free of errors when using smARt.assembly, while we observed one participant with a build error when using the smart glasses. The study is ongoing, and we will evaluate more participants to assess whether the results are statistically significant. Furthermore, we will present the results of the survey at a later point. For now, we can state that most of the participants showed excitement about this presentation of assembly instructions and expressed this excitement in the survey as well.
6 Conclusion
With smARt.assembly we presented a projection-based AR system that helps untrained workers to assemble products without any previous knowledge or training. The system shows both picking information for the next part to assemble and assembly instructions in the physical workspace of the user.
Even though a formal evaluation of the system is not part of this paper, we presented first insights into our ongoing evaluation. We are currently continuing our user study, which compares various presentation techniques, such as smart glasses, projection-based AR, and paper manuals, in the context of industrial manufacturing. During this study, many people have already worked with the smARt.assembly system presented in this paper, with promising results: compared to our previous prototype with smart glasses [19], the projection-based system seems to be much more robust with respect to changing light conditions. The ongoing evaluation indicates that users working with the projection-based instructions are able to assemble products faster and with a lower error rate than with the system based on smart glasses. Furthermore, users working with smARt.assembly showed excitement about this presentation of assembly instructions. All users have been able to assemble the products without any previous knowledge of the assembly steps, just by following the guidance given by our system. We will continue our study to obtain more, and statistically significant, results about the impact of different presentation techniques.
In addition to the ongoing evaluation, future work could include the integration of other input modalities into the assembly station, such as gesture control, or computer vision technologies for detecting the completed step as well as providing a means of quality control.
All links were last followed on December 18, 2015.
References
Baumann, H.: Order picking supported by mobile computing. Ph.D. thesis, University of Bremen, January 2013. http://elib.suub.uni-bremen.de/edocs/00102979-1.pdf
Billinghurst, M., Hakkarainen, M., Woodward, C.: Augmented assembly using a mobile phone. In: Proceedings of the 7th International Conference on Mobile and Ubiquitous Multimedia MUM 2008, NY, USA, pp. 84–87 (2008). http://doi.acm.org/10.1145/1543137.1543153
Büttner, S., Cai, T., Cramer, H., Rost, M., Holmquist, L.E.: Using computer vision technologies to make the virtual visible. In: Mobile AR: Design Issues & Opportunities Workshop at MobileHCI 2011. ACM (2011)
Büttner, S., Sand, O., Röcker, C.: Extending the design space in industrial manufacturing through mobile projection. In: Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct MobileHCI 2015, NY, USA, pp. 1130–1133 (2015). http://doi.acm.org/10.1145/2786567.2794342
Caudell, T., Mizell, D.: Augmented reality: An application of heads-up display technology to manual manufacturing processes. In: Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences, vol. 2, pp. 659–669, January 1992
Cruz-Neira, C., Sandin, D.J., DeFanti, T.A.: Surround-screen projection-based virtual reality: the design and implementation of the cave. In: Proceedings of the 20th Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH 1993, pp. 135–142 (1993)
Fraunhofer, I.-I.: SmartFactoryOWL - Home. http://www.smartfactory-owl.de/index.php/en/. Accessed 2 June 2015
Gorecky, D., Campos, R., Chakravarthy, H., Dabelow, R., Schlick, J., Zühlke, D.: Mastering mass customization - a concept for advanced, human-centered assembly. Assembly Manufact. Eng. 11, 62 (2013)
Günthner, W.A., Blomeyer, N., Reif, R., Schedlbauer, M.: Pick-by-Vision: Augmented Reality unterstützte Kommissionierung. Lehrstuhl für Fördertechnik Materialfluss Logistik - Technische Universität München, Garching (2009). http://www.fml.mw.tum.de/fml/images/Publikationen/Abschlussbericht%20Pick-by-Vision.pdf
Hinrichsen, S., Jasperneite, J., Schrader, F., Lücke, B.: Versatile assembly systems - requirements, design principles and examples. In: Villmer, F.J., Padoano, E. (eds.) 4th International Conference on Production Engineering and Management PEM 2014, Lemgo, September 2014
Jones, B.R., Benko, H., Ofek, E., Wilson, A.D.: Illumiroom: Peripheral projected illusions for interactive experiences. In: ACM SIGGRAPH 2013 Emerging Technologies SIGGRAPH 2013, no. 7 (2013)
Korn, O., Funk, M., Schmidt, A.: Towards a gamification of industrial production: A comparative study in sheltered work environments. In: Proceedings of the 7th ACM SIGCHI Symposium on Engineering Interactive Computing Systems EICS 2015, NY, USA, pp. 84–93 (2015). http://doi.acm.org/10.1145/2774225.2774834
Korn, O., Schmidt, A., Hörz, T.: Augmented manufacturing: A study with impaired persons on assistive systems using in-situ projection. In: Proceedings of the 6th International Conference on PErvasive Technologies Related to Assistive Environments PETRA 2013, NY, USA, pp. 21:1–21:8 (2013). http://doi.acm.org/10.1145/2504335.2504356
Low, K., Welch, G., Lastra, A., Fuchs, H.: Life-sized projector-based dioramas. In: Proceedings of the ACM Symposium on Virtual Reality Software and Technology VRST 2001, NY, USA, pp. 93–101 (2001). http://doi.acm.org/10.1145/505008.505026
Metaio GmbH: metaio | SDK Overview. http://www.metaio.com/products/sdk/. Accessed 19 Mar 2015
Otto, M., Prieur, M., Rukzio, E.: Using scalable, interactive floor projection for production planning scenario. In: Proceedings of the Ninth ACM International Conference on Interactive Tabletops and Surfaces ITS 2014, NY, USA, pp. 363–368 (2014). http://doi.acm.org/10.1145/2669485.2669547
Paelke, V.: Augmented reality in the smart factory: Supporting workers in an industry 4.0. environment. In: Emerging Technology and Factory Automation (ETFA), pp. 1–4. IEEE, September 2014
Paelke, V., Röcker, C.: User interfaces for cyber-physical systems: Challenges and possible approaches. In: Proceedings of the International Conference on Human-Computer Interaction HCII 2015 (2015)
Paelke, V., Röcker, C., Koch, N., Flatt, H., Büttner, S.: User interfaces for cyber-physical systems: expanding the designer’s toolbox. at-Automatisierungstechnik 63(10), 833–843 (2015). Walter de Gruyter GmbH
Schreiber, W., Alt, T., Edelmann, M., Malzkorn-Edling, S.: Augmented reality for industrial applications - a new approach to increase productivity? In: Proceedings of the 6th International Scientific Conference on Work With Display Units WWDU 2002. pp. 380–381 (2002)
Tang, A., Owen, C., Biocca, F., Mou, W.: Comparative effectiveness of augmented reality in object assembly. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems CHI 2003, pp. 73–80 (2003)
Underkoffler, J., Ullmer, B., Ishii, H.: Emancipated pixels: Real-world graphics in the luminous room. In: Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques SIGGRAPH 1999, pp. 385–392. ACM Press/Addison-Wesley Publishing Co., New York (1999). http://dx.doi.org/10.1145/311535.311593
Weaver, K.A., Baumann, H., Starner, T., Iben, H., Lawo, M.: An empirical task analysis of warehouse order picking using head-mounted displays. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems CHI 2010, pp. 1695–1704 (2010)