Abstract
The concept of the smart factory is being applied to traditional manufacturing systems. Since factory automation is introduced gradually, human-robot collaboration (HRC) systems are becoming an important issue. To construct an effective HRC system, clear communication between the human workers and the robots has to be considered. This research proposes a conceptual framework of a process-model-based HRC system for efficient human-robot collaboration in a semi-automated process that produces electric motors. We apply a process modeling methodology to capture the collaborative features, activities, and resource flows in the manufacturing process. The model defined by the proposed methodology serves both as the data store that contains the process information and as the interface between the human workers and robots, providing accurate information in the appropriate context. Furthermore, machine vision technology is implemented to recognize the specifications of work-in-process (WIP) parts. Recognized parts are mapped to the correct work order and work instruction manual defined by the proposed model. To reduce human workers' errors, the extracted work information is transmitted to the worker through an augmented reality device. The proposed HRC system is expected to support the construction of a semi-automated system that reduces the errors of human workers and ensures production flexibility.
1 Introduction
This research proposes a conceptual framework of a human-robot collaboration (HRC) system for a semi-automated process that produces electric motors. The process consists of automated machines, a handling robot, and a human worker. The automated machines perform machining operations such as cut-off, drilling, and facing. The robot assists the human worker in material handling: it picks up parts from each automated machine, places them on pallets, and transfers the partially processed parts to the assembly section, where a human worker performs the assembly. In current practice, there is no exchange of part information between the robot and the human worker. The worker does not receive information about the type of parts picked up by the robot in advance, i.e., before a part arrives at the assembly section, particularly during a changeover of part model. Hence, delays occur when the worker needs to measure and identify the part.
In this paper, machine vision technology is applied to identify parts in the collaboration process. Once image processing through machine vision is completed, the accurate work order and part information corresponding to the extracted image must be determined. We define the work order and part information using Part-flow based Manufacturing Process Modeling (PMPM), which can express the process flow and the part flow at the same time [1]. To transmit the tracked data to the worker immediately, we use an augmented reality (AR) device to deliver work order information and product specifications. This study proposes a conceptual framework of an HRC system using a process model and an AR interface.
2 Literature Review
2.1 Human-Robot Collaboration in Assembly Operation
Human-robot collaboration is one of the new trends in the field of industrial robots and forms part of the Industry 4.0 strategy. Full automation has limitations in achieving the required production demand because of high process complexity and the equipment flexibility needed for rapid response. Hence, HRC, whose advantage is maximizing flexibility in production, has become one of the important trends in current industrial applications. Human-robot collaboration is a system in which a robot and a human collaborate on the same task, combining the benefits of both in a joint task [2]. An HRC system plays an important role in increasing the flexibility needed to cope with a wide variety of products and frequent equipment changeovers driven by fluctuating demand [3]. In an HRC workstation, the collaborative robot must be able to sense the presence of a human, i.e., have an appropriate safety mechanism that enables collaboration with humans without a safety cage.
Machine Vision for Assembly.
A machine vision system consists of an image capture device, lighting, and a computer with image processing software [4]. Machine vision systems enable flexibility in automation by giving machines the capability to "see" and think. In the existing research, the main application of machine vision systems is quality monitoring across a wide range of industries, such as electromechanical parts [5], textile manufacturing [6], pharmaceuticals [7], food packaging [8], and steel making [9]. Teck et al. [10] integrated a machine vision system into automation systems to provide product data that supports the decision making of the production system. Gao et al. [11] developed machine vision to track and calibrate the coordinates of fast-moving battery lids in an automated sealing-ring assembly system. Applying machine vision with robots in assembly systems increased efficiency and the robots' adaptability to environmental changes, and enabled real-time adjustment of robot motion.
AR for Assembly.
AR is a technology derived from the field of VR; it refers to a computer graphics technique that superimposes virtual 3D content onto the user's view of the real world [12]. Since augmented reality models and registers only the required virtual objects on real-time images of the actual manufacturing environment, the construction cost and time for the 3D models needed to implement a new manufacturing system can be drastically reduced.
Kollatsch et al. [13] discussed an AR-based application for mobile devices that realizes a user-friendly, problem-oriented visualization of information. However, that research is limited to proposing an AR-based concept for visualizing process values on mobile devices. Wang et al. [14] presented a novel human-cognition-based interactive augmented reality assembly guidance system to investigate how AR can provide various modalities of guidance to assembly operators during the different phases of the user's cognitive process in assembly tasks. In addition, Danielsson et al. [15] implemented an environment in which an untrained worker uses AR in a human-robot collaboration setting. In that study, however, various errors occurred during assembly because the test persons misunderstood the instructions.
Although many studies have been conducted to exploit the strengths of AR technology in manufacturing system implementation, there is still a lack of AR systems that can be properly utilized in manufacturing.
2.2 Part-Flow Based Manufacturing Process Modeling: PMPM
PMPM is a modeling tool for visualizing the man (worker), machine, material (parts), and method (activity) of a collaborative manufacturing process. It can clearly define the order of the manufacturing operations that make up a process and the objects that execute them. The method consists of 6 notations for activity expression and 10 notations for part expression. Using the part notations, it can record a part's history and manage part status, and it can define the collaborative activities in each process. The flow of parts and the flow of work can be defined identically or independently. It is possible to define changes in part characteristics, such as changes in the number of management units of parts or the merging of parts. It also allows a clear definition of the manufacturing facilities used. PMPM is a manufacturing-process-oriented modeling methodology. In this paper, we use PMPM to store process information and track part information.
3 Process Model Based HRC System
3.1 System Framework
Figure 1 shows the data flow diagram of the proposed process-model-based HRC system. The electric motor manufacturing process consists of four automated workstations, which perform coil forming and cap machining, and the assembly of the electric motor by a human worker. Two material handling methods are used in the process: (1) a conveyor transfers parts between the automated workstations; (2) a robot collects the partially completed components from these workstations on a pallet and transfers them to the assembly workstation. The PMPM methodology is used in this study to build the digital model of this process. The PMPM model stores information about the work orders, the part flows, and the objects performing each task. When a part arrives at the assembly process, the operator recognizes the part through the AR device. In this paper, machine vision is integrated into the AR interface to identify the parts in the assembly model. After the parts are identified, the work order corresponding to the identified part, such as the assembly sequence, the required parts, and the production quantity, is extracted from the PMPM model and transmitted to the AR interface. The PMPM model also simulates the start time of each workstation so as to minimize the production cycle time. Coordinating the processing times of the workstations with the robot's material handling times, i.e., the part pick-up time at each automated workstation and the delivery time to the human assembly workstation, is important to ensure a smooth flow and to avoid part stacking and idle time at each station, especially the human assembly workstation. Hence, the real-time production status at the human assembly workstation has to be sent to the robot via the AR interface to "call" for parts or to "stop" the input of parts. This method also reduces overproduction at each automated workstation and part shortages, based on the updated finished-part quantities.
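As a concrete illustration of this coordination, the sketch below implements a simple buffer-threshold policy for the "call" and "stop" signals. It is a minimal sketch: the thresholds, message names, and function signature are assumptions for illustration, not part of the proposed framework's specification.

```python
from enum import Enum

class Signal(Enum):
    CALL = "call"   # ask the robot to deliver more parts
    STOP = "stop"   # pause part input to the assembly workstation
    HOLD = "hold"   # keep the current behavior

def assembly_signal(buffer_level: int, low: int = 2, high: int = 8) -> Signal:
    """Decide the message the assembly workstation sends to the robot,
    based on how many parts are waiting in its input buffer."""
    if buffer_level <= low:
        return Signal.CALL
    if buffer_level >= high:
        return Signal.STOP
    return Signal.HOLD
```

With the default thresholds, a buffer of one part triggers a CALL and a buffer of nine parts triggers a STOP; in between, the robot keeps its current schedule.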
3.2 Implementation
PMPM Modeling.
The first step is to model the electric motor assembly process using PMPM. Figure 2 shows the resulting model of the target process. Rectangles are the tasks performed by automated machines; rounded rectangles and octagons are the collaborative work of the robot and the worker. As shown in Fig. 2, the type and quantity of the input and output parts are defined in the model. The model therefore plays the role of the central data engine of the HRC system: it holds the main process information and is used to track part information and to verify image processing results. Through the PMPM model, the human worker who performs the assembly receives, via the AR device, real-time updates of the part model and quantity produced at each automated workstation. Hence, the PMPM model acts as the communication interface between the automated workstations and the human worker who performs the assembly.
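The data-engine role of the model can be illustrated with a minimal sketch, shown below. The Python structures and field names are our illustrative assumptions, not the PMPM notation itself.

```python
from dataclasses import dataclass, field

@dataclass
class WorkOrder:
    """Work information extracted from the process model for one part model."""
    part_model: str                    # e.g., a specific cap/case variant
    assembly_sequence: list[str]       # ordered task names for the worker
    required_parts: dict[str, int]     # part type -> quantity per assembly
    production_quantity: int

@dataclass
class PMPMStore:
    """Lookup layer over the process model: identified part -> work order."""
    work_orders: dict[str, WorkOrder] = field(default_factory=dict)
    part_history: dict[str, list[str]] = field(default_factory=dict)

    def record_station(self, part_id: str, station: str) -> None:
        # Track the stations a part has visited (part history management).
        self.part_history.setdefault(part_id, []).append(station)

    def instructions_for(self, part_model: str) -> WorkOrder:
        # Verify an image-processing result against the model and fetch its order.
        if part_model not in self.work_orders:
            raise KeyError(f"unknown part model: {part_model}")
        return self.work_orders[part_model]
```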
Image Processing.
The electric motor models are differentiated according to the size of the case and the size of the cap. The dimensions of the cap and case are shown in Table 1. Since the outer shapes of the parts are similar, the human worker may assemble the wrong combination. Measuring with a Vernier caliper every time before the assembly process reduces the productivity of the operation. Hence, part measurement using a machine vision system is proposed to increase the accuracy of part identification and the productivity of the assembly process. To achieve this, the vision system must be capable of acquiring images for part detection that are as close as possible to human vision [16]. The image processing steps used to identify the cap and measure its dimensions when it arrives at the assembly section are shown in Fig. 3. First, the image is captured by the augmented reality interface and sent to MATLAB. Image processing is performed using the Image Processing Toolbox in MATLAB. Processing starts by converting the acquired image to grayscale. The image is then converted to a binary image, and edges are detected using the Canny method. Since the cap is round, the 'imfindcircles' function is used to identify the shape within a defined radius range. Finally, the identified cap is marked and its radius is measured in pixels (see Fig. 4). The algorithm needs further improvement to remove shadows from the captured image and to provide measurements in centimeters.
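The pilot pipeline above uses MATLAB's Image Processing Toolbox. As an illustration of the same steps, the sketch below uses Python with OpenCV; its HoughCircles routine, which runs a Canny edge detector internally, stands in for MATLAB's edge(...,'Canny') plus 'imfindcircles'. The calibration factor and radius range are assumed values for illustration.

```python
# Illustrative OpenCV stand-in for the MATLAB pipeline described above.
import cv2

PIXELS_PER_CM = 80.0  # assumed calibration factor; depends on camera and mounting

def identify_cap(image_path: str, radius_range_cm=(2.0, 4.0)):
    """Detect a circular cap and return its center and radius (pixels and cm)."""
    img = cv2.imread(image_path)
    if img is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)   # grayscale conversion
    gray = cv2.medianBlur(gray, 5)                 # suppress noise and shadow speckle
    r_min = int(radius_range_cm[0] * PIXELS_PER_CM)
    r_max = int(radius_range_cm[1] * PIXELS_PER_CM)
    # HOUGH_GRADIENT applies Canny internally (param1 is its upper threshold)
    # and then votes for circle centers within the given radius range.
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
                               param1=120, param2=40,
                               minRadius=r_min, maxRadius=r_max)
    if circles is None:
        return None                                # no cap found in the image
    x, y, r_px = circles[0][0]
    return {"center": (int(x), int(y)),
            "radius_px": float(r_px),
            "radius_cm": float(r_px) / PIXELS_PER_CM}
```

A fixed pixels-per-centimeter factor assumes a constant camera-to-part distance; in practice the factor would be obtained by calibrating against a reference object of known size, which also addresses the centimeter-measurement limitation noted above.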
AR Interface Development.
In this study, an app development environment is built. An Android app is developed using Vuforia and Unity; it uses an image target and a virtual button, which are linked to a tracker through camera rendering. The developed application visualizes actual factory data and pre-analyzed simulation data for the manager and presents them to the human worker in real time through the AR device. Managers can make manufacturability decisions and verify production plans through pre-simulation. In addition, an environment for real-time log data preprocessing and data exchange can be constructed by linking the log data from manufacturing facilities with the MES.
To build the AR interface, we used the Vuforia module within the Unity engine. Unity is a virtual/augmented reality production tool with the advantage of creating mobile AR programs for both iOS and Android, which is expected to be effective in manufacturing settings where various kinds of AR equipment are combined. The user can monitor real-time information on the identified part. The information is predefined and can be freely configured according to the characteristics of the operator and of each part.
Figure 5 shows the pilot program screen of the AR interface. Based on the part information verified by image processing, it provides the operator, through AR, with the assembly recipes and process information stored in the database. The developed AR application allows process values to be stored and analyzed so that process-related information can be shown to the operator or administrator. Thus, the user does not need extensive knowledge of the machine and its control device, and the production process is not disturbed. The collected data can be displayed to the user in real time or evaluated later. Process data are stored in the database and can be retrieved by other applications. A traditional monitoring system provides operators with simple signals on a designated board, whereas the AR-based interface provides the operator with the real-time status of the plant and makes the work more efficient. The operator can virtually overlay plant parameters and process states at the actual location of a sensor, machine, or actuator, and can also visualize information about areas that are normally inaccessible or dangerous.
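Although the AR client itself is built with Unity and Vuforia, the lookup it performs after part identification can be sketched independently of the engine. The following minimal Python sketch shows how a recipe might be fetched and packaged for display; the database contents, names, and fields are hypothetical.

```python
# Hypothetical recipe store; in the actual system this data would come from
# the PMPM-backed database rather than an in-memory dictionary.
RECIPES = {
    "cap-A": {"steps": ["place case", "insert cap", "fasten screws"],
              "parts_needed": {"case-A": 1, "screw-M4": 4}},
}

def ar_payload(identified_model: str) -> dict:
    """Build the overlay content the AR device shows for an identified part."""
    recipe = RECIPES.get(identified_model)
    if recipe is None:
        # Unknown part: prompt the worker to re-scan instead of guessing.
        return {"model": identified_model, "error": "no recipe found; re-scan part"}
    return {"model": identified_model, **recipe}
```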
4 Conclusion
In this paper, we proposed a process-model-based HRC system using an AR interface. We defined the work orders, part flows, and collaboration information of the electric motor assembly process using part-flow based manufacturing process modeling (PMPM). The PMPM model acts as the interface between the human worker, the automated processing machines, and the material handling robot through the identification of parts and their corresponding work sequences. To capture part information efficiently, AR devices and image processing are integrated: an image processing technique extracts the features of a component from the image captured by the AR device and feeds the part identification result back to the human worker via the AR device. The proposed system is expected to reduce human error in part model identification and work sequencing. Furthermore, production flexibility that can cope with frequent model changeovers can be achieved. The main contributions of this framework are data management using a process model and the use of image processing to improve part recognition in AR. However, since this study is at an early stage, the proposed framework needs to be verified and its practical problems identified through application.
References
Lee, H., Ryu, K., Son, Y.J., Cho, Y.: Capturing green information and mapping with MES functions for increasing manufacturing sustainability. Int. J. Precis. Eng. Manufact. 15(8), 1709–1716 (2014)
Faber, M., Bützler, J., Schlick, C.M.: Human-robot cooperation in future production systems: Analysis of requirements for designing an ergonomic work system. Procedia Manufact. 3, 510–517 (2015)
Thomas, C., Matthias, B., Kuhlenkötter, B.: Human-robot collaboration: new applications in industrial robotics. In: International Conference on Competitive Manufacturing (COMA), pp. 293–299, Stellenbosch (2016)
Malamas, E.N., Petrakis, E.G., Zervakis, M., Petit, L., Legat, J.D.: A survey on industrial vision systems, applications and tools. Image Vis. Comput. 21(2), 171–188 (2003)
Di Leo, G., Liguori, C., Pietrosanto, A., Sommella, P.: A vision system for the online quality monitoring of industrial manufacturing. Opt. Lasers Eng. 89, 162–168 (2017)
Cho, C.S., Chung, B.M., Park, M.J.: Development of real-time vision-based fabric inspection system. IEEE Trans. Industr. Electron. 52(4), 1073–1079 (2005)
Možina, M., Tomaževič, D., Pernuš, F., Likar, B.: Automated visual inspection of imprint quality of pharmaceutical tablets. Mach. Vis. Appl. 24(1), 63–73 (2013)
Duan, F., Wang, Y., Liu, H.: A real-time machine vision system for bottle finish inspection. In: ICARCV 2004 8th Control, Automation, Robotics and Vision Conference, vol. 2, pp. 842–846. IEEE, Kunming, China (2004)
Yun, J.P., Choi, D.C., Jeon, Y.J., Park, C., Kim, S.W.: Defect inspection system for steel wire rods produced by hot rolling process. Int. J. Adv. Manuf. Technol. 70(9–12), 1625–1634 (2014)
Teck, L.W., Sulaiman, M., Shah, H.N.M., Omar, R.: Implementation of shape-based matching vision system in flexible manufacturing system. J. Eng. Sci. Technol. Rev. 3(1), 128–135 (2010)
Gao, M., Li, X., He, Z., Yang, Y.: An automatic assembling system for sealing rings based on machine vision. J. Sens. 2017, Article ID 4207432, 12 p. (2017). https://doi.org/10.1155/2017/4207432
Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., MacIntyre, B.: Recent advances in augmented reality. IEEE Comput. Graphics Appl. 21(6), 34–47 (2001)
Kollatsch, C., Schumann, M., Klimant, P., Wittstock, V., Putz, M.: Mobile augmented reality based monitoring of assembly lines. Procedia CIRP 23, 246–251 (2014)
Wang, X., Ong, S.K., Nee, A.Y.C.: Multi-modal augmented-reality assembly guidance based on bare-hand interface. Adv. Eng. Inform. 30(3), 406–421 (2016)
Danielsson, O., Syberfeldt, A., Brewster, R., Wang, L.: Assessing instructions in augmented reality for human-robot collaborative assembly by using demonstrators. Procedia CIRP 63, 89–94 (2017)
Peña-Cabrera, M., Lopez-Juarez, I., Rios-Cabrera, R., Corona-Castuera, J.: Machine vision approach for robotic assembly. Assem. Autom. 25(3), 204–216 (2005)