Social Robot SyPEHUL Bolabot
Abstract—This paper discusses the development of a Social Robot named SyPEHUL (System of Physic, Electronic, Humanoid Robot and Machine Learning) which can recognize and track a human face. The face recognition and tracking process uses the Cascade Classification and LBPH (Local Binary Pattern Histogram) Face Recognizer methods based on the OpenCV library and Python 2.7. The social robot hardware, based on an Arduino microcontroller, contains 12 DoF (Degrees of Freedom) of servo motors to actuate the robotic head and its face. The face recognition system has been implemented on the Social Robot, which can recognize and track a human face and then mention the person's name. The resulting face recognition system of the Social Robot shows good accuracy for Human-Robot Interaction.

Index Terms—Face Recognition, Social Robot, Arduino, Python 2.7, Human-Robot Interaction, SyPEHUL.

I. INTRODUCTION

Face recognition is an image processing method to locate the human face, which needs a camera to capture an image of the face. The image processing searches for the important features of a human face in the image, so that other objects are ignored [1]. Human faces are located using various algorithms and methods, for example: AdaBoost [2], the Viola-Jones method [3] [4] [5], the Roberts Cross method [6], and others [7] [6]. As the classifier to recognize a human face, one can use Local Binary Patterns (LBP) [8], Hidden Markov Models (HMM) [9], Bayesian methods [10], Support Vector Machines (SVM) [11], and others [12] [13].

In this decade, face recognition has been implemented in many projects: social robots [14] [15], attendance automation [16], home security [17] [18] [19], games [5], and others. Especially for social robot projects, face recognition can increase the robot's ability for Human-Robot Interaction, because the Social Robot is a future technology which can communicate with, entertain, or help humans in their work. Social Robots built by researchers include, for example, KISMET [14], Eddie [20], Flobi [21], Muecas [22], Probo [23], and others [15].

This paper describes the development of a Social Robot named SyPEHUL (System of Physic, Electronic, Humanoid Robot and Machine Learning) which can recognize and track a human face. The real-time face detection and recognition uses the Cascade Classification method (Viola-Jones method) and the LBPH (Local Binary Pattern Histogram) Face Recognizer method based on the OpenCV library and Python 2.7. Finally, the face recognition system is implemented on the 12 Degree of Freedom (DoF) Social Robot named SyPEHUL, based on an Arduino microcontroller, to recognize and track a human face for Human-Robot Interaction.

The paper is organized as follows. Section 2 describes the theoretical background of face detection and face recognition in detail. Section 3 describes the experimental method of this research. Section 4 describes the results and the implementation of face recognition and tracking on the Social Robot in detail. Finally, Section 5 gives the concluding remarks.

II. THEORETICAL BACKGROUND

A. Face Detection using the Viola-Jones Method

Paul Viola and Michael Jones published the face detection method usually referred to as the Viola-Jones method (or simply Viola-Jones) in 2001 [2]. This method can detect a human face in an image using four key concepts:

1) Haar features: Haar features are used to find out whether a human face exists in the captured image [24]. These features detect the bright side and the dark side of the captured image [25]. The existence of a Haar feature is determined by subtracting the average of the light-region pixels from the average of the dark-region pixels. The calculation uses the rectangular features shown in Fig. 1, applied around the detected face as shown in Fig. 2. If the difference is above a threshold, the Haar feature is said to "exist".

Fig. 1. Haar rectangular features [2]: (a) Edge, (b) Line, (c) Four-rectangle.

2) Integral Image: The integral image is used to speed up feature detection by accumulating the pixel values of the original image. The integrated value at each pixel already represents the sum of all pixels above and to the left of the
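As a concrete illustration of the two ideas above (a minimal sketch, not the paper's own code), the integral image and a two-rectangle Haar edge feature can be computed as follows; the tiny test image and the region layout are hypothetical:

```python
import numpy as np

def integral_image(img):
    """ii[y, x] = sum of all pixels above and to the left of (y, x), inclusive."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, top, left, h, w):
    """Sum of an h-by-w rectangle using only 4 lookups in the integral image."""
    A = ii[top - 1, left - 1] if top > 0 and left > 0 else 0
    B = ii[top - 1, left + w - 1] if top > 0 else 0
    C = ii[top + h - 1, left - 1] if left > 0 else 0
    D = ii[top + h - 1, left + w - 1]
    return D - B - C + A

def haar_edge_feature(img, top, left, h, w):
    """Edge feature: mean of the dark (left) half minus mean of the light (right) half."""
    ii = integral_image(img.astype(np.int64))
    half = w // 2
    dark = rect_sum(ii, top, left, h, half) / (h * half)
    light = rect_sum(ii, top, left + half, h, half) / (h * half)
    return dark - light

# Hypothetical 2x4 patch: bright region (9s) next to a dark region (1s).
img = np.array([[9, 9, 1, 1],
                [9, 9, 1, 1]])
print(haar_edge_feature(img, 0, 0, 2, 4))  # 8.0 = dark mean 9 minus light mean 1
```

However large the rectangle, `rect_sum` always costs four array lookups, which is what makes scanning thousands of Haar features per window feasible in real time.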
2017 2nd International Conferences on Information Technology, Information Systems and Electrical Engineering (ICITISEE)
TABLE I
THE FACE RECOGNITION ACCURACY RATE OF TRAINED RESPONDENTS IN PERCENT (%)

No  Name    Accuracy (%)
1   Dyah    80
2   Mada    100
3   Rizki   80
4   Fikri   100
5   Ikhsan  80
6   Jaka    100
7   Awin    100
8   Tiara   100
9   Indry   100
10  Lutfi   100
11  Ratih   80
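The overall recognition rate implied by Table I can be checked with a quick computation (a sketch; the per-respondent numbers are copied from the table):

```python
# Accuracy per trained respondent, copied from Table I.
accuracies = [80, 100, 80, 100, 80, 100, 100, 100, 100, 100, 80]

# float() keeps the division exact under both Python 2.7 (used in the paper) and Python 3.
average = sum(accuracies) / float(len(accuracies))
print(round(average, 2))  # 92.73, matching the rate reported in the conclusion
```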
on the vertical axis, with units in pixels. From the equation, the value to control the neck movement can be obtained; the movement is divided into four directions, namely Up, Down, Right, and Left, as illustrated in Fig. 12.
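The selection among the four neck movements can be sketched by comparing the detected face center with the frame center; the 640x480 frame size and the dead-zone threshold below are illustrative assumptions, not values from the paper:

```python
FRAME_W, FRAME_H = 640, 480   # assumed webcam resolution
DEAD_ZONE = 40                # assumed tolerance (in pixels) around the frame center

def neck_moves(face_cx, face_cy):
    """Return the movements (Up/Down/Right/Left) that bring the face center
    toward the middle of the frame."""
    moves = []
    dx = face_cx - FRAME_W // 2
    dy = face_cy - FRAME_H // 2
    if dx > DEAD_ZONE:
        moves.append("Right")
    elif dx < -DEAD_ZONE:
        moves.append("Left")
    if dy > DEAD_ZONE:
        moves.append("Down")   # image y grows downward
    elif dy < -DEAD_ZONE:
        moves.append("Up")
    return moves

print(neck_moves(600, 240))  # ['Right']
print(neck_moves(320, 100))  # ['Up']
```

The dead zone keeps the servos from jittering when the face is already roughly centered.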
Fig. 12. WebCam pixel coordinates and angles of the motor servos.

Fig. 13. The examination of face recognition and tracking for Human-Robot Interaction.
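Mapping a face-center pixel coordinate onto the servo ranges used in this research (90° to 150° horizontally, 20° to 70° vertically) can be sketched as a linear interpolation; the 640x480 frame size is an assumption for illustration:

```python
def pixel_to_angle(value, in_max, out_min, out_max):
    """Linearly map a pixel coordinate in [0, in_max] to a servo angle in
    [out_min, out_max], in the spirit of Arduino's map() function."""
    return out_min + (out_max - out_min) * float(value) / in_max

# Horizontal: 0..640 px -> 90..150 degrees; vertical: 0..480 px -> 20..70 degrees.
pan = pixel_to_angle(320, 640, 90, 150)   # face centered horizontally
tilt = pixel_to_angle(480, 480, 20, 70)   # face at the bottom of the frame
print(pan, tilt)  # 120.0 70.0
```

On the Arduino side the equivalent computation could be done directly with `map()` before writing the angle to a servo.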
To make the Social Robot head follow a human face, the resulting center coordinate of the face must correspond to the angle of the motor servos. The minimum and maximum servo angles must be set so that the Social Robot head can look up/down (vertically) or right/left (horizontally). In this research, the motor servos are set from 90° to 150° for the horizontal angle range and from 20° to 70° for the vertical angle range. The servo angle setup is written and uploaded to the Arduino board to control the head of Social Robot SyPEHUL. Fig. 12 describes the image pixel information, directions, and servo angles of the Social Robot head. The concept of moving the Social Robot head is as follows: the webcam captures a human face; the algorithm then obtains the center coordinate of the face, which is processed in Python; from Python, the information is sent to the Arduino to control the Social Robot head so that it follows the human face.

C. The System Implementation to the Social Robot

After the face recognition and tracking system proved successful, it was applied to control the head of Social Robot SyPEHUL. The result of the examination is shown in Fig. 13. The social robot head can follow the human face properly and can recognize it. The implementation of face recognition was successful; in addition, the Social Robot can mention the person's name. The resulting face recognition and tracking can be applied for Human-Robot Interaction.

V. CONCLUSION

This research has presented the development of a Social Robot named SyPEHUL which can recognize and track the human face. The face recognition, processed by an algorithm based on Python 2.7 (with the OpenCV library) using the Cascade Classification and LBPH Face Recognizer methods, achieves a good accuracy rate (92.73%). Also, the implementation of facial tracking to control the 12 DoF of Social Robot SyPEHUL based on an Arduino microcontroller works effectively to control the robot's head. Future work will focus on combining speech recognition and facial expressions to enhance the emotional expression of SyPEHUL for Human-Robot Interaction.

REFERENCES

[1] M. S. Kalas, "Real Time Face Detection and Tracking Using OpenCV," International Journal of Soft Computing and Artificial Intelligence, vol. 2, no. 1, pp. 41–44, 2014.
[2] P. Viola and M. J. Jones, "Robust Real-time Object Detection," Cambridge Research Laboratory, Cambridge, Massachusetts, Tech. Rep., Feb. 2001.
[3] S. Tikoo and N. Malik, "Detection of Face using Viola Jones and Recognition using Back Propagation Neural Network," International Journal of Computer Science and Mobile Computing, vol. 5, no. 5, pp. 288–295, 2016.
[4] R. Boda and M. J. P. Priyadarsini, "Face Detection and Tracking Using KLT and Viola Jones," ARPN Journal of Engineering and Applied Sciences, vol. 11, no. 23, pp. 13472–13476, 2016.
[5] C. Zhan, W. Li, P. Ogunbona, and F. Safaei, "A Real-Time Facial Expression Recognition System for Online Games," International Journal of Computer Games Technology, vol. 2008, pp. 1–7, 2008.
[6] S. Das, "Comparison of Various Edge Detection Technique," International Journal of Signal Processing, Image Processing and Pattern Recognition, vol. 9, no. 2, pp. 143–158, 2016.
[7] H. C. V. Lakshmi and S. PatilKulakarni, "Segmentation Algorithm for Multiple Face Detection in Color Images with Skin Tone Regions using Color Spaces and Edge Detection Techniques," International Journal of Computer Theory and Engineering, vol. 2, no. 4, pp. 552–558, 2010.