
robotics

Article
Augmented and Virtual Reality Experiences for Learning
Robotics and Training Integrative Thinking Skills
Igor Verner 1,*, Dan Cuperman 1, Huberth Perez-Villalobos 1, Alex Polishuk 1 and Sergei Gamer 2

1 Faculty of Education in Science and Technology, Technion-Israel Institute of Technology, Haifa 3200003, Israel
2 Faculty of Industrial Engineering and Management, Technion-Israel Institute of Technology,
Haifa 3200003, Israel
* Correspondence: ttrigor@technion.ac.il; Tel.: +972-545480092

Abstract: Learning through augmented reality (AR) and virtual reality (VR) experiences has become
a valuable approach in modern robotics education. This study evaluated this approach and
investigated how 99 first-year industrial engineering students explored robot systems through such
online experiences while staying at home. The objective was to examine learning in the AR/VR
environment and evaluate its contribution to understanding the robot systems and to fostering
integrative thinking. During the AR experiences that we developed using Vuforia Studio, the
students learned about TurtleBot2 and RACECAR MN robots while disassembling and modifying
their models and by obtaining information about their components. In the VR experience with the
RacecarSim simulator, the students explored sensor-based robot navigation. Quizzes were used to
assess understanding of robot systems, and a post-workshop questionnaire evaluated the workshop's
contribution to learning about the robots and to training integrative thinking skills. The data indicate
that the students gained understanding of the robot systems, appreciated the contribution of the
augmented and virtual reality apps, and widely used integrative thinking throughout the practice.
Our study shows that AR apps and virtual simulators can be effectively used for experiential learning
about robot systems in online courses. However, these experiences cannot replace practice with
real robots.

Keywords: robotics education; first-year engineering students; online learning; augmented reality;
virtual simulation; integrative thinking

Citation: Verner, I.; Cuperman, D.; Perez-Villalobos, H.; Polishuk, A.; Gamer, S. Augmented and
Virtual Reality Experiences for Learning Robotics and Training Integrative Thinking Skills. Robotics
2022, 11, 90. https://doi.org/10.3390/robotics11050090

Academic Editor: Thierry Chaminade

Received: 25 July 2022; Accepted: 1 September 2022; Published: 6 September 2022

Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and
institutional affiliations.

Copyright: © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution (CC BY)
license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction

In educational literature, there is a lively debate on utilizing the concepts of virtual
reality (VR) and augmented reality (AR) environments in which learners interact with
virtual objects as part of their learning process [1]. In VR, learning concentrates on
interaction only with virtual objects [2,3]. In AR, virtual objects are superimposed on the
real environment, and learners interact with real and virtual objects, while these two kinds
of objects do not interact among themselves [4]. In mixed reality (MR), learning includes
interactions among real and virtual objects [5]. In this paper, we focus on learning in
AR and VR environments.

1.1. Augmented Reality in Education

Augmented reality enhances traditional education and provides several advantages
to the learning process. Ref. [6] performed a meta-analysis of studies published between
2007 and 2017 to determine the effectiveness of AR applications for learning across
different variables. The findings indicated that learning with AR positively impacts
learning outcomes. The reported educational benefits and added values of AR were more
attractive learning environments, improved subject comprehension, and making abstract
concepts more concrete, leading to better understanding, recall, and concentration.

Regarding the implementation of AR in instructional laboratories, Ref. [7] studied
the effects of using AR environments on learning and cognitive load in university physics
laboratory courses, and Ref. [8] carried out a similar study in an electrical circuits course.
In both studies, the courses were taught to two groups of undergraduate students, one
with a traditional lab-work format and the other with AR-assisted lab-work. Results of
the post-course knowledge tests in the first study did not reveal significant differences
in learning outcomes of the groups, while in the second study they indicated that AR-
supported learning led to higher conceptual knowledge gains than non-AR learning. The
post-course questionnaires in both studies showed that the cognitive load of the students
studying in the AR-assisted lab was significantly lower than that in the traditional lab.

1.2. Augmented Reality in Engineering Education and Robotics


Learning activities in augmented reality are effectively used in engineering education.
For example, in the study [9], 60 first-year undergraduate students majoring in electronics
and electrical engineering participated in a laboratory course and learned to use an oscilloscope
and a function generator. The students were divided into an experimental group and a control
group. The experimental group practiced operating the equipment in the augmented
reality environment and then operated the actual equipment, while the control group was
taught with the traditional approach using the laboratory manual. Results of the study
indicated significant advantages of the experimental group over the control group in terms
of operation skills, lower cognitive load, and higher appreciation of the learning experience.
Recent studies considered the use of augmented reality environments for learning mo-
bile robots. The study [10] presents an experiment in which the students developed a mobile
robot controlled by the GoGo Board and operated it in augmented reality using the data
from the virtual sensors, supplied through the HoloLens app. Participants of the study were
36 university students divided into experimental and control groups who practiced
operating robots to perform maze navigation tasks. For students of the experimental group,
the sensor data were presented through AR, while for the control group they were displayed
on the computer screen. The study found that students’ achievements in learning robot
operation in the experimental group were significantly higher than in the control group.
The study [11] considers a lab exercise in which 20 undergraduate students practiced
path planning and navigation of the Khepera II mobile robot using information supplied
through AR. The participants appreciated the acquired experience in using AR and the
contribution of the AR tools to learning the subject.
While demonstrating that AR apps can be used in robotics education, the discussed
case studies did not consider the pedagogical aspects of learning the subject in AR environ-
ments. In the study presented in this paper, we focused on these aspects and developed
and implemented AR experiences as an online, remotely accessible practice in explor-
ing modern robot systems. The AR experiences were applied not only to help students
learn concepts, but also to foster the development of integrative thinking skills for solv-
ing problems in robot construction, programming, and operation. Modern intelligent
robots are complex integrated systems of components that implement different disruptive
technologies. Learning how to develop and operate such robots requires the students to
make connections among the concepts and develop a holistic view of the robot system,
i.e., apply integrative thinking [12].
In the study presented in this paper, we developed, implemented, and evaluated a
workshop “Mobile Robots in Augmented Reality”, in which the students explored two
different mobile robots: TurtleBot2 of Yujin Robotics [13] and RACECAR MN developed
at MIT [14]. The students, while at home, practiced with the robots in augmented and
virtual reality environments. This setting for the workshop practice was chosen due to the
social distancing restrictions caused by the pandemic. In the workshop, the students used
their own mobile devices to “place” virtual robots on top of real tables at their homes. In
the AR experiences, the students investigated the architecture and the components of two
different robots, while in the VR experiences, they used the RacecarSim simulator to explore
the sensor-based navigation of the RACECAR. The evaluation of the workshop outcomes
focused on the contribution of the experiences to understanding the robot systems and
to fostering the students' application of integrative thinking.

1.3. Learning with Understanding


Learning with understanding (LWU) is characterized by making connections among
the learned concepts and prior knowledge [15]. LWU includes the ability to use the knowl-
edge to solve practical problems [16]. Ausubel et al. [17] pointed out that LWU can occur
when the learning material is understandable and the learners have prior knowledge
needed to understand the subject, are willing to make an effort to understand it, and are
scaffolded to apply appropriate learning strategies. De Weck et al. [18] (p. 168) empha-
sized that solving real problems related to a complex engineering system requires deep
understanding of the system’s structure, functionality, and principles of its operation. Un-
derstanding an engineering system requires a careful assessment of the system’s scale and
scope, its function, its structure (or architecture), its dynamics, and changes over time.
Shen and Zhu [19] noted that complex engineering systems encompass concepts
from different fields and that scaffolding needed to facilitate learning of such systems
should take forms adequate to these concepts. The authors proposed to provide scaffolding
for learning concepts of complex systems in physical and virtual environments in the
forms of presentation, conceptualization, and problem-solving. Presentation scaffolding
helps students to understand the learned concepts by using simulations, animations,
and other forms of tangible illustration. Conceptual scaffolding contributes to student
understanding by providing conscious explanations about the concepts that underlie the
learned system. Problem-solving scaffolding enables the students to understand a system
more deeply by involving them in solving practical problems related to its architecture and
functionality. In this study, we implemented all three scaffolding techniques to assist
student learning of complex robot systems.

1.4. Integrative Thinking


Researchers from different fields interpreted the meaning of integrative thinking (IT)
differently, depending on the context of their studies [20,21]. Martin [22] (p. 13) studied IT
in the area of management and defined it as an ability to resolve tensions among opposing
ideas and generate a creative solution in the form of a new superior idea. Tynjälä [23]
considered the concept of IT in adult education and perceived it as the ability to synthesize
different ideas and perspectives, not necessarily opposing ones.
In engineering, integrative thinking skills are vital when dealing with complex tech-
nological systems [24]. Engineers apply IT when they analyze systems to comprehend
their architecture and functionality and develop new integrated systems. In robotics, the
robot operating system (ROS) is an excellent example of how integrative thinking is being
applied. ROS allows robot systems to be extended and different robots to be integrated
into multi-robot systems [25].
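As a concrete illustration of this kind of integration, the minimal sketch below shows how
two independently developed subsystems can be composed simply by agreeing on message
topics. It is our own illustration assuming ROS 1 with rospy; the node and topic names are
hypothetical, not taken from any cited system.

```python
#!/usr/bin/env python
# Hypothetical sketch of ROS-based integration (assuming ROS 1 with rospy):
# a bridge node turns LIDAR scans from one subsystem into a stop/go signal
# that a separately developed drive subsystem can consume.
import rospy
from sensor_msgs.msg import LaserScan
from std_msgs.msg import Bool

def scan_callback(scan, pub):
    # Integrate perception into actuation: block driving if any valid
    # LIDAR return is closer than 0.5 m.
    obstacle_near = any(0.0 < r < 0.5 for r in scan.ranges)
    pub.publish(Bool(data=not obstacle_near))  # True = safe to drive

if __name__ == "__main__":
    rospy.init_node("safety_bridge")
    pub = rospy.Publisher("/drive_enable", Bool, queue_size=1)
    rospy.Subscriber("/scan", LaserScan, scan_callback, callback_args=pub)
    rospy.spin()
```

Because the two subsystems share only the topic contract, either side can be replaced, or
further robots added, without changing the other; this composability is the system-level
property referred to above.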
Although IT is crucial for engineering, we could not find a definition of this ability
in the engineering education literature. Some authors, e.g., [26], used Martin’s definition
even though it does not account for the specific characteristics of integrative thinking when
applied to engineering systems. As we explored ways to develop students’ IT skills through
practice with robot systems, we found it necessary to define this ability within the context
of engineering systems. The following is a working definition of IT that we propose:
Integrative thinking on engineering systems is the cognitive process of considering the systems
as structures of devices that by working in coordination provide a system-level functionality beyond
the functions of the individual devices.
When describing the ability of integrative thinking, educators distinguish between
verbal integrative thinking skills and visual integrative thinking skills [27]. With regard to
engineering systems, visual integrative thinking skills allow one to combine images of
the system's parts and workspace and create the comprehensive view needed for solving
problems related to the system's design and operation. Verbal integrative thinking skills
allow one to comprehend and analyze different verbal information on a specific aspect of
the system and draw inferences needed for solving system engineering problems related
to this aspect.
The engineering education literature asserts that integrative thinking skills can be developed through
learning practice and that engineering curricula should foster development of these
skills [28–30]. The experiential learning of engineering systems, and particularly that
of robot systems, offers students rich opportunities to develop their IT skills, make con-
nections between different disciplines, and integrate theoretical knowledge and practical
skills. Team projects in robotics bring together students with different backgrounds, who
apply integrative thinking when collectively performing their project tasks [31]. When
the students analyze a robot system or create a new one from components, they need to
comprehend the functionality and affordances of each component, and how it interacts
with other components, physically or digitally. When programming and operating robots
in physical and virtual environments, the students should perceive the affordances of robot
operation and monitor its behaviors in the workspace. In these intellectual activities of
comprehension, perception, and monitoring, the students can vigorously apply and train
their integrative thinking skills.
Educators point out that engineering programs rarely pay explicit attention to fostering
and assessing students’ integrative thinking skills [32]. We did not find studies addressing
this subject in the educational robotics literature.

2. Materials and Methods


2.1. Research Goal and Questions
The study was conducted in the 2020–2021 academic year in the framework of the
workshop “Mobile Robots in Augmented Reality” delivered to first-year students of the
Technion Faculty of Industrial Engineering and Management (IEM). The goal of the study
was to examine how the students learned through online practice with robot models in
virtual and augmented realities and how they evaluated the contribution of this practice to
training their integrative thinking. The research questions were:
1. Did the students, involved in the online augmented and virtual reality experiences
with robot systems, develop an understanding of the systems?
2. How did the students evaluate the contribution of augmented and virtual reality
experiences to training integrative thinking skills?
In this study, we developed augmented reality experiences, conducted the workshop based
on these and previously developed experiences, and evaluated students’ learning outcomes.

2.2. Method
This research is an exploratory pilot study which evaluates the feasibility of involving
first-year students majoring in industrial engineering in online experiential learning of
robot systems outside the robotics lab, through augmented and virtual reality experiences.
Because of the limited access to the students caused by the pandemic and course restrictions,
the study did not include a control group that practiced with the same robot systems,
without AR and VR, in parallel to the experimental group. To increase the validity of the
research findings, we compared the learning outcomes of the experimental group in the
current workshop with the relevant outcomes of our previous workshops that are presented
below in Section 2.3.
The study used a mixed-method design that employs qualitative and quantitative
approaches to data collection and analysis of the learning outcomes. According to [33],
the outcomes can be divided into cognitive, skill-based, and affective. In our study, the
evaluation of cognitive outcomes focused on understanding the robot systems and on
how the learning activities engaged the students in applying integrative thinking skills.
Evaluation of technical skills concentrated on the ability to explore and operate the robots
in AR and VR environments. The evaluation of affective, attitudinal outcomes considered
students' reflections on their practice with Industry 4.0 technologies.

2.3. The Workshop Intentions


In the last ten years, we have been conducting robotics workshops for the IEM’s
first-year students, with the goal of increasing their awareness about industrial engineering,
enabling them to experience robot systems for the first time, and developing their generic
skills required for engineering studies and practice. Over these years, the workshop’s
technological environment and learning activities have undergone significant upgrades.
The first version of the workshop was held in the faculty robotics lab. The students prac-
ticed operating Scorbot robot-manipulators in physical and virtual environments to assemble
block structures. The assembling tasks were oriented towards developing spatial skills [34].
The second version of the workshop was developed after the laboratory was equipped
with an advanced industrial robot Baxter. In the workshop, the students, while in the
faculty computer class, practiced operating the robots Scorbot ER5 and Baxter using the
RoboCell and Gazebo simulators. The tasks focused on exploring robot manipulation
affordances [35]. The third version of the workshop was developed in order to provide
practical training to students when the faculty robotics laboratory and computer class
were closed because of pandemic restrictions, and classes were conducted only remotely.
Therefore, our intentions in designing the workshop evolved and were as follows:
• Provide student practice with modern robots using the online environment that we
developed based on the AR and VR technologies.
• Facilitate learning of a complex robot system, in which students discover the principles
of operation of its components and do not take them as black boxes.
• Offer opportunities for training integrative thinking skills through practice with the
robot systems.
• Help students to understand the essence of the technological transformation brought
by the Fourth Industrial Revolution and the learning opportunities it brings.
• Test a possible implementation of the above intentions in a short-term online workshop.

2.4. AR Experience with RACECAR MN


In our past study [31] we developed an AR application for studying the TurtleBot2
robot system. Based on this experience, in the current study, we developed a new AR
application that allows students to practice with a more complex and advanced RACECAR
MN robot. The Rapid Autonomous Complex Environment Competing Ackermann steering
Robot Model Nano (RACECAR MN) is an autonomous mini racecar (Figure 1) designed
by the MIT Beaver Works Summer Institute (BWSI) program for teaching students the
concepts and technologies of self-driving cars [36]. The rationale for developing the new
AR application was to support experiential learning about RACECAR MN even when the
physical robot is not accessible. The AR application was developed in three stages, in
which we created the 3D model, the animations, and the experience, as described below.

Figure 1. The RACECAR MN.

2.4.1. Creating a 3D Model of the RACECAR MN

The created model presented the main subsystems of the robot: a Jetson Nano onboard
computer; an Intel RealSense color and depth camera with an integrated Inertial
Measurement Unit (IMU); a laser imaging, detection, and ranging device (Lidar); and a
car chassis, including all the components for driving and steering the car. We developed
the model of the robot chassis (Figure 2a) using the SolidWorks software and integrated
it with the computer-aided design (CAD) models of other robot components provided by
the MIT BWSI into the holistic model of the robot system presented in Figure 2b.

Figure 2. The robot's CAD models: (a) The chassis CAD model; (b) RACECAR MN virtual model.

2.4.2. Creating the Animations

At the next step of the AR app development, Creo Illustrate (www.ptc.com/en/products/creo/illustrate,
accessed on 30 August 2022) was used to create animations of the robot model. The
animations presented how to break down RACECAR MN into subsystems and break down
the subsystems into components, while providing information about them on the
attached labels.
2.4.3. Creating the AR Experience

At the final stage of the AR app development, Vuforia Studio 8.3.0 (ptc.com/en/products/Vuforia/Vuforia-studio,
accessed on 30 August 2022) was used to upload the model and animation to the cloud
and create the graphical user interface (GUI) for interacting with the experience. The GUI
provides the main view of the entire robot and separate views of each of the three robot
layers. In the main view, the animation enabled the user to manage the presentation of the
tags attached to the components of the virtual robot. The user could choose to display only
the red tags, related to the main components (Figure 3a), only the blue tags of the peripheral
components (Figure 3b), or hide all the tags to better observe the model (Figure 3c).

Figure 3. Model main view: (a) Labels of main components; (b) Labels of peripherals; (c) No labels.

To provide better visualization of the model, the application enables the user to observe
separately each of the three layers of robot components: top, middle, and bottom. The top
layer displayed in Figure 4a includes the Lidar, monitor, and peripherals. The middle layer
includes the Jetson Nano onboard computer, RealSense camera, and peripherals, displayed
in Figure 4b. The lower layer is the car chassis (Figure 2a) consisting of the driving and
steering mechanisms. The button in the upper right corner of the main view (Figure 3c)
opens an option to view each of the layers and get extended information about the main
components. In each of the layer views, the animation enables the user to "explode" the
layer into discrete components and assemble it back again while controlling the distance
between the components using a slider that can be seen at the bottom of Figure 4a,b.

Figure 4. Layer views: (a) Top layer; (b) Middle layer fully exploded; (c) Exploratory page.

In each of the layer views, pressing on any of the red labels opened an explanatory
page consisting of descriptions of the principle of operation of the component and its
function in the RACECAR. Figure 4c depicts the explanatory page displayed when pressing
the label of the RealSense depth camera.

2.5. The Robotics Workshop

The 6-h workshop was conducted as part of the Introduction to Industrial Engineering
and System Integration (IIESI) course. The workshop included a 2-h introductory lecture
and two 2-h practical sessions, all given online because of the social distancing restrictions.
The outline of the workshop is presented in Figure 5 and is described below.

Figure 5. Workshop outline.

The lecture consisted of three parts. The first part briefly introduced the subject area
of industrial engineering, the concept of the Fourth Industrial Revolution (Industry 4.0),
and autonomous robotics in virtual and augmented realities.
In the second part, we presented the TurtleBot2 mobile robot, the robot operating
system ROS, and the RGB-D camera. Then, we provided guidelines on how to prepare for
the lab sessions: install the Vuforia View software on the mobile device, run the AR app, and
access the instructional resources. After that, we presented the three activities included in
the AR experience with TurtleBot2 [31]: disassembling the robot into components, attaching
a basket on top of the robot, and replacing the single board computer.
At the beginning of the third part, we introduced the concept of smart transportation
and presented the RACECAR MN robot system and its main components: NVIDIA Jetson
Nano computer, Intel RealSense depth camera, LIDAR sensor, and PWM motor controller.
Then, the AR experience, which included disassembling the robot and learning about its
components, was presented. Finally, the guidelines on how to install and use the 3D driving
simulator RacecarSim were provided.
The students, who stayed at home, used the Vuforia View app on their smartphones
and laptops to set up the augmented workspace we created using Vuforia Studio. In the
practical sessions, the students were engaged in experiential learning of two robot systems:
TurtleBot 2 and RACECAR MN.
The first session offered the augmented reality experience with TurtleBot2 that we
developed and first implemented in 2020 [31]. The students first set up their AR workspace
using personal mobile devices and laptops. Then they performed exercises specified in the
worksheet and answered the quiz questions.
In the first exercise, the students virtually dismantled TurtleBot2, learned its main
components, and composed the block diagram of the robot system. Then, the students
virtually assembled the robot, composed a to-do list for assembling its motion sub-system,
and answered questions related to the robot components and their functions. In the second
exercise, the assignment was to attach to the robot a suitable basket container for cargo trans-
portation. The students analyzed the available containers and chose one that could be firmly
assembled on top of the robot. The assignment for the third exercise was to replace the Rasp-
berry Pi computer of TurtleBot2 with one from the list of available single board computers
(SBCs). The students observed the characteristics of the boards presented in the worksheet,
compared them with the characteristics of the Raspberry Pi displayed on the mobile device,
and selected the SBC which had characteristics compatible with those of the Raspberry Pi.
The second session implemented our new online lab, composed of experiences with
RACECAR MN in virtual and augmented realities. In the virtual experience, the students
operated the RacecarSim simulator and used the keyboard to manually navigate the robot
and explore the surroundings in different scenarios. Throughout the tasks, the students
monitored the outputs of all robot sensors and observed how they changed dynamically
during robot movement. Figure 6a depicts the simulator screen in which the virtual racecar
traverses the track, while the three monitors on the left side of the screen display the
readings of the virtual LIDAR sensor, depth camera, and color camera. The LIDAR monitor
presents the robot’s position by a green triangle in the center of the circle, while obstacles
in the surrounding are represented by red lines. The depth camera monitor displays an
image, in which distances to the objects in front of it are represented by different colors.
Figure 6. The robot: (a) In virtual reality; (b) In augmented reality.

The learning tasks focused on understanding the data from the virtual robot sensors
provided by the simulator. In the first step, the students drove the virtual robot and figured
out the maximal velocity indicated by the inertial measurement unit (IMU) in the top right
of the simulator screen. Then, they calibrated the LIDAR and depth camera readings, which
appeared with no scales or units. To calibrate the LIDAR sensor, the students drove the
robot at the maximal velocity towards a distant object. They measured the time from the
moment when the LIDAR detected the object and displayed the red line seen on top of the
LIDAR monitor in Figure 7a until the moment when the robot reached the object (Figure 7f).

Figure 7. The color scale of the depth camera readings at different distances: (a) 4 m, (b) 3 m, (c) 2 m,
(d) 1 m, (e) 0.5 m, (f) 0 m.
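The arithmetic behind this timing-based calibration is simply distance = velocity × time.
The sketch below is our own illustration in Python, not code from RacecarSim; the function
name and the sample numbers are hypothetical.

```python
# Illustrative sketch of the LIDAR calibration described above (not RacecarSim
# code); the maximal velocity and timing values are hypothetical examples.

MAX_VELOCITY = 2.0  # m/s, as read from the simulator's IMU display (assumed)

def lidar_range_from_timing(t_detect: float, t_reach: float,
                            v_max: float = MAX_VELOCITY) -> float:
    """Estimate the LIDAR detection range: driving at constant v_max, the
    robot covers the whole range between first detection and impact."""
    return v_max * (t_reach - t_detect)

# Example: the red line appeared 2.1 s before the robot reached the object.
print(lidar_range_from_timing(t_detect=0.0, t_reach=2.1))  # -> 4.2 (m)
```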

In the augmented experience, the students loaded the virtual model of RACECAR
MN on their mobile devices and placed it on the worktable in their physical environments
(Figure 6b). The students inspected the virtual robot from different viewpoints, disassembled
it, and learned about its main components from virtual tags. We focused learning through
AR experiences on the notion of the robot as a system of components, each of which
performs a certain function and which together provide the robot's functionality. As most
students did not have prior knowledge of robotics, the workshop started with learning basic
concepts in the lecture, and then the students applied them to construct new understandings
during the practical sessions. The experience acquired from the practice with a simpler
TurtleBot2 system in the first session helped the students understand the more advanced
robot RACECAR MN.

We utilized the three scaffolding techniques noted in Section 1.3. The AR apps together
with the simulator provided presentation scaffolds that included a variety of tangible
visualizations and opportunities for the students to observe the robot systems from different
viewpoints and in different motion conditions. Conceptual scaffolds were given in the
lecture where all the studied concepts were consciously explained and in the exploratory
pages that were incorporated in the AR visualizations. Problem-solving scaffolds were
offered while the students were engaged in a series of practical problem-solving activities
with robot systems.
The workshop prompted the students to use integrative thinking in different ways.
Table 1 lists the workshop tasks and related learning activities, along with the applied
integrative thinking skills.

Table 1. Workshop tasks, learning activities, and applied integrative thinking skills.

Task | Students' Activities | Applications of IT
Set up a personal AR workspace | Placing the virtual robot on the home table using the mobile device screen | Creating an integrated view of real and virtual objects
Make a block diagram of the robot system | Exploring the robot structure, components, and their interactions | Creating a concept map of the robot system
Set the robot motion system assembly order | Determining the assembly sequence for the robot motion system | Creating a visual representation of an assembly
Replace the on-board computer of the robot | Replacing the robot computer with a selected alternative one | Selecting an item by analysis of its technical characteristics
Attach a container to the robot | Upgrading the robot system by attaching a suitable container | Selecting an item by analysis of its shape and dimensions
Explore robot sensors and their fusion | Measuring distances in the simulated environment using the robot sensors | Creating a workspace image based on multi-sensor data
Navigate the robot to avoid obstacles | Determining the path and speed of robot motion using sensor fusion | Dynamic integration of spatial and kinematics data

The table indicates that the students applied IT skills, both visual and verbal, when
setting up and practicing in the AR environment (task 1), creating block diagrams of the
robot system (task 2), determining the order of assembly operations (task 3), replacing
existing and attaching new robot components (tasks 4 and 5), exploring sensor fusion
(task 6) and navigating the robot (task 7).

2.6. Evaluation of Learning Outcomes


We followed up on the workshop and collected data from students' answers to the two
online quizzes that were part of the learning activities directed by the worksheet. The
follow-up aimed to examine how the students learned through online practice with robot
models in virtual and augmented realities and how they evaluated the contribution of this
practice to training their integrative thinking.
The evaluation study involved 99 first-year students who participated in the workshop.
Most of them were 20–24 years old; 60% were female and 40% male. Fewer than 13% of the
students had learned robotics before the course, in school or other settings.

2.6.1. Data Collection


We collected and analyzed qualitative and quantitative data on the workshop out-
comes using the mixed method. We also collected students’ evaluations of practices with
TurtleBot and RACECAR MN robot systems and compared the evaluations by applying the
method of comparative case studies [37]. For data collection, we used two online learning
quizzes included in the worksheet: one related to the experience with TurtleBot2, and
another related to RACECAR MN. Moreover, the workshop evaluation questionnaire was
administered at the end of the workshop, as an online Google Form.
The quiz related to TurtleBot2 was the same as that developed and used in our
previous study [31]. It included twelve questions intended to evaluate the understanding
of the concepts acquired in the workshop and the ability to apply integrative thinking.
For example, one of the questions related to the robot motion system, which the students
explored through an AR animation of disassembling/assembling the robot. The students
were asked to compose a to-do list for assembling the motion system of TurtleBot2.
The quiz related to learning with RACECAR MN included ten questions. Questions
1–5 addressed the augmented reality experience, and questions 6–10 referred to robot
operation using RacecarSim. In question 1, the students were given an empty block
diagram of a robot and asked to specify in each block the name of the actual component of
RACECAR MN and indicate by arrows the connections between the robot components.
Questions 2–5 evaluated understanding of the principles of operation of the main robot
components. For example, question 2 was as follows:
In the PWM motor controller of the RACECAR MN:
1. If the on-time of the pulses increases and the duty cycle decreases, then the speed of the
motor increases.
2. If the on-time of the pulses decreases and the duty cycle increases, then the speed of the
motor increases.
3. If the on-time of the pulses decreases and the duty cycle remains the same, then the speed of the
motor decreases.
4. If the on-time of the pulses remains the same and the duty cycle increases, then the speed of the
motor decreases.
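The relation the question targets is that motor speed tracks the PWM duty cycle (on-time
divided by period), so a change in on-time alone does not determine the speed. The
following minimal Python sketch, our own illustration with hypothetical values, makes
the relation concrete.

```python
# Hedged sketch of the PWM relation behind the quiz question; the supply
# voltage and timing values are hypothetical, not from the RACECAR MN.

def duty_cycle(on_time: float, period: float) -> float:
    """Fraction of each PWM period during which the pulse is on."""
    return on_time / period

def average_voltage(v_supply: float, on_time: float, period: float) -> float:
    """Average voltage applied to the motor; its speed scales with this."""
    return v_supply * duty_cycle(on_time, period)

# On-time drops from 2 ms to 1.5 ms, yet the duty cycle (and hence the
# speed) rises because the period shrinks faster.
print(average_voltage(12.0, on_time=2.0e-3, period=10e-3))  # 20% -> 2.4 V
print(average_voltage(12.0, on_time=1.5e-3, period=5e-3))   # 30% -> 3.6 V
```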
Question 3 was devoted to the issue of the efficiency of the central and graphics
processing units. Questions 4 and 5 examined understanding of the functionality of the
depth camera and the LIDAR sensor. Questions 6–10 related to the tasks performed with the
RacecarSim simulator. For example, in Question 8, the students were asked to estimate the
distance represented by each of the colors displayed by the depth camera by comparing
the colors to known distances determined by using the LIDAR sensor (Figure 7).
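The estimation in Question 8 amounts to building a lookup from depth-camera color bands
to LIDAR-calibrated distances (Figure 7). The sketch below is purely illustrative; the color
names and anchor distances are hypothetical stand-ins for the simulator's actual palette.

```python
# Hypothetical color-to-distance calibration in the spirit of Question 8;
# the bands and anchors are illustrative, not the simulator's real output.
COLOR_TO_DISTANCE_M = {
    "dark blue": 4.0,   # farthest band, anchored by a LIDAR reading at 4 m
    "light blue": 3.0,
    "green": 2.0,
    "yellow": 1.0,
    "red": 0.5,         # nearest band, anchored by a LIDAR reading at 0.5 m
}

def estimate_distance(color_band: str) -> float:
    """Estimate the distance to an object from its depth-camera color band."""
    return COLOR_TO_DISTANCE_M[color_band]

print(estimate_distance("green"))  # -> 2.0 (m)
```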
In the questionnaire, the students were asked to specify the used technological tools
(smartphone/tablet, Apple/Android, Vuforia-view/Video-substitute). They also shared
their reflections on the robots, the AR/VR technologies, and the difficulties in using them.
The students evaluated the contribution of the AR/VR experiences to learning about
TurtleBot2 and RACECAR MN. The students were also requested to evaluate to what
extent the practices with each robot helped them to understand it as an integrated system
and fostered their integrative thinking. The answers were given on a five-point
Likert-style scale, with 1 for 'no contribution' and 5 for 'high contribution'. The students
were also asked to explain their answers.

2.6.2. Data Analysis


Data analysis focused on the research questions of the study. To answer the first
research question, we evaluated students’ understanding of robot organization and func-
tionality based on the scores that they got on the two quizzes. To evaluate students’
performance on the quizzes, for each quiz we assigned weights to its questions and
developed rubrics for evaluating the answers. For example, the rubric for the question
about the block diagram of TurtleBot2 and RACECAR MN (included in each of the two
quizzes) evaluated to what extent the diagram correctly presented all the components and
the connections among them. In the answers to the question on the TurtleBot2 motion
system, we evaluated the presence of all relevant components and the correctness of their
order in the assembly sequence. In the analysis of the answers to multiple-choice questions
related to practice with virtual RACECAR MN, we evaluated whether the answers selected
by the students were correct.
Part of the questions in the workshop evaluation questionnaire related to the first
research question and addressed students’ reflections on how the VR/AR environment
supported understanding the robot systems. We analyzed the use of hardware and software
tools, categorized the difficulties experienced by the students, and summarized students’
evaluations of the contribution of the online practice to learning the robots and their
components. The students’ evaluations of the current workshop were compared with those
of the previous workshops [31].
To answer the second research question, we considered students’ answers to the
questions of the workshop evaluation questionnaire, in which the students evaluated
how the learning activities contributed to the development of the specified integrative
thinking skills. Statistical analysis was performed on the responses to the multiple-choice
questions, including a test for correlations between the scores on different aspects of the
workshop contribution. The qualitative responses were used to verify and explain the
quantitative results. The results obtained for each of the robots were compared.

3. Results
3.1. Workshop Assignments
Regarding the first research question, all the students successfully performed the
workshop assignments and answered the questions of the related quizzes. Students’ mean
scores on the TurtleBot2 and RACECAR MN quizzes were 92.8 and 94.6, respectively.
Lower percentages of correct answers were obtained for the question on the TurtleBot2
motion system in the first quiz (49.4%) and the RACECAR MN block diagram question in
the second quiz (74.5%). For the other questions, the scores in both quizzes were above 97.

3.2. Technological Tools


The findings regarding the use of hardware and software tools by the students are
presented in Figure 8. As shown, 94 students noted their participation in the AR experience
in the workshop evaluation questionnaire. Of these, 85% practiced with the AR app through
Vuforia View and the rest reported that they failed to run the app for technical reasons,
and instead, watched the video recorded experience. Among the AR app users, 84% used
smartphones and the rest used tablets. Moreover, 53% of their mobile devices ran on
Apple's iOS and the rest were from other manufacturers, running Android.

Figure 8. Use of technological tools.

The findings indicate that all the students who could not run the AR app and instead
watched the video were Android users. Some of the students who succeeded in running
the app reported difficulties in using it. Specifically, 27% of the students reported difficulties
in "placing" the robot in the workspace and viewing it from different perspectives, and
problems with panning and zooming the image. Further, 14% of all the students who
participated in the workshop reported that they were challenged by online learning of a
subject that was completely new for them.

3.3. AR/VR Experience


Despite the difficulties, the students highly appreciated the contribution of the VR/AR
practice to understanding the robots and components. Indeed, 81% evaluated this contribu-
tion as notable and 46% of them considered it as high or very high. The written reflections
support the quantitative results. The students noted several advantages of AR experiences
compared to teacher explanations:
“The AR experiences helped us understand the robot structure and components much
better than the verbal explanations.”
They acknowledged the opportunities for robot exploration provided by the AR app:
“Using AR, we were able to observe from different angles how the robots are built, just
as if we were looking at them in real life. In addition, the explanation given for each
component helped to understand the robot better.”
“By observing the robot being disassembled into parts and then being reassembled, we
were able to better understand the relationships between the different components and
their functions.”
The students also appreciated the practice with the RacecarSim. They wrote:
“Interesting simulation! Through it, we understood the importance of each of the sensors
and their meaning.”
“Working on the simulator helped us better understand the different components, such as
the laser sensor and the distance camera.”
While appreciating the AR/VR experience, many students noted that it cannot become an
equal substitute for practice with real robots. The repeated reflections were:
“Augmented reality allowed me to get to know the components well, but if they could be
physically handled, connected, and disconnected, that would be a great improvement.”

3.4. Difficulties in AR/VR Experience


Though most of the students succeeded in gaining an understanding of the robot structure
and components from the AR experiences, several students reported difficulties that
they experienced in this practice. Some of them related these difficulties to the lack of
technical background. One student wrote:
“I think we need more prior knowledge of the parts that are building the robot. I don’t
know about motors, etc., so the issue was difficult for me.”
Certain visualizations were not completely clear to the students for several reasons. One
was that the small smartphone screen made it difficult to see details. In this regard one of
the students noted:
“It is difficult to understand on a small screen of a cellphone where each part connects.”
Another difficulty was in observing disassembling and assembling components that had
the same color, as indicated in the following reflection:
“It was difficult to understand what each component was because they are all black.”
One more difficulty was that the components’ labels were not presented in the visualizations
of assembling and disassembling the robots. The relevant reflection:
“Because the explanatory labels were absent, it was difficult to understand how the parts
were connecting.”

3.5. Contribution to Understanding Robot Structure


We compared the evaluations of the workshop contribution given in the current study
with those from the previous studies, regarding the understanding of the robot structure
and components. As found, this contribution was evaluated as notable by
the same percentage of students (81%) from both workshops. However, the percentage
of students who evaluated the contribution as high was significantly larger in the current
workshop (46%) than in the past workshop (29%).

3.6. Understanding Robots as Integrated Systems


To answer the second research question, we focused on students’ evaluation of how
the different activities of the workshop engaged them in applying integrative thinking
skills. Students’ evaluations of the workshop are summarized in Table 2. The first column
lists the aspects of the workshop contribution. The second and third columns present
the percentage of the students who evaluated the workshop contribution to each learning
outcome as notable or high. Aspects 1–4 relate to integrative thinking about the studied
robot systems, while Aspect 5 addresses the possible integration of AR/VR experiences as
experiential activities in learning about robotics in industrial engineering.

Table 2. Students’ evaluations of the workshop contribution (%).

Aspects of the Contribution | Notable | High
Understanding the robot structure and its components | 81 | 46
Understanding TurtleBot2 as an integrated system | 77 | 40
Understanding RACECAR MN as an integrated system | 91 | 60
Understanding the interactions among robot components | 84 | 51
Experience that can be used for studying real robot systems | 90 | 68

Table 3 presents Spearman correlation coefficients calculated to test for a relationship
between students' evaluations of the workshop contribution for understanding the
robot structure and components (Aspect 1) and the three other aspects of the contribution
(Aspects 2–4). As found, the correlations were relatively strong, with p < 0.001.

Table 3. Spearman's correlations among the aspects of the contribution.

Aspects of the Contribution | Understanding TurtleBot2 as an Integrated System | Understanding RACECAR MN as an Integrated System | Understanding the Interactions among Robot Components
Understanding the robot structure and its components | 0.657 | 0.584 | 0.398
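For reference, the statistic in Table 3 corresponds to a standard Spearman computation.
The sketch below assumes the per-student five-point scores for two aspects are available
as equal-length lists; the sample values are illustrative, not the study data.

```python
# Hedged sketch of the Table 3 computation; the score lists are fabricated
# examples, not the collected questionnaire data.
from scipy.stats import spearmanr

structure_scores = [5, 4, 4, 3, 5, 2, 4]  # Aspect 1: structure and components
turtlebot_scores = [4, 4, 5, 3, 5, 3, 4]  # Aspect 2: TurtleBot2 as a system

rho, p_value = spearmanr(structure_scores, turtlebot_scores)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")
```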

As mentioned above and as Table 2 shows, most students positively evaluated
the contribution of the AR and VR practices to understanding the robot structure and
components. From students’ reflections:
“The workshop gave me a lot of practice in integrative thinking.”
“The workshop helped me a lot in understanding the whole subject since I did not have
knowledge about robots before. I could see the robot falling apart and how it connects.
What each part is and to what category it belongs.”
In addition, 84% evaluated the contribution to understanding the interactions among
robot components as notable, and half of the students evaluated this contribution as high.
A typical reflection:
“As I observed the decomposition and assembly of the robots, I gained a better under-
standing of the relationships among the different components and their functions.”
Most students noted that the practices contributed to their understanding of how the
TurtleBot2 and RACECAR MN operate as integrated systems (77% and 91%, respectively). The
Wilcoxon signed-rank test indicated that the scores for understanding RACECAR MN as an
integrated system were significantly higher (Z = −4.4, p < 0.001) than those for TurtleBot2.
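The paired comparison reported here is a standard Wilcoxon signed-rank computation over
each student's two scores. The sketch below is our illustration with made-up sample scores,
not the study data.

```python
# Hedged sketch of the Wilcoxon signed-rank test; the paired score lists are
# illustrative placeholders for the per-student questionnaire responses.
from scipy.stats import wilcoxon

turtlebot_scores = [3, 4, 3, 4, 4, 3, 4, 2]  # TurtleBot2 as a system
racecar_scores   = [4, 5, 4, 5, 5, 4, 5, 4]  # RACECAR MN as a system

stat, p_value = wilcoxon(turtlebot_scores, racecar_scores)
print(f"W = {stat}, p = {p_value:.4f}")
```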

In their reflections, the students wrote that from first-hand experience with the two
instructional robots, they came to understand the basic principles of robot construction
and operation:
“By experimenting with the robots and exploring most of their parts, we were able to
generate more general ideas, such as how larger and more useful robots work, how they
are operated, and what their basic parts are. Thus, the private experience with a small
number of robots has contributed to understanding general ideas related to other robots.”
“The experience has made me think about the solutions robots can offer in many different
areas, and about streamlining the processes that using robots can bring.”

3.7. Student Evaluations Compared


We compared the student evaluations of the workshop given in the current study with
those in the previous study [31]. As found, the evaluation of the workshop contribution in
the current study was significantly higher than in the previous study. High evaluations
of the contribution were given by 46% vs. 29% for understanding the robot structure
and components, 40% vs. 14% for understanding the robot as an integrated system, and
68% vs. 45% for understanding the interactions among the components.

4. Discussion
In this paper, we explore the potential for online experiential learning about robot
systems in virtual and augmented reality environments, learning which is particularly
relevant in times of social distancing. The published studies evaluated the knowledge
gained through such educational processes but did not consider the integrative thinking
skills that are applied and can be fostered by this learning practice. Our study demonstrated
that such learning practice can facilitate the understanding of the structure and functionality
of robots, expose novice engineering students to novel digital technologies, and provide
rich opportunities for applying integrative thinking skills.
The study included technological, pedagogical, and educational research parts. In the
technological part, we developed an augmented reality experience with the RACECAR MN
robot system. Our motivation to develop this AR experience was to use it for learning the structure of the system and the functionality of its components. This was required because, under the social distancing restrictions, the students did not have access to the physical robot and practiced only with the RacecarSim simulator, which adequately represents the robot’s kinematics but not its structure.
We coped with three technological challenges. The first was creating an authentic 3D model that represents the whole RACECAR MN system and its components in full detail. The second was creating the AR animation that allows users to isolate each robot subsystem and component, observe them from different viewpoints, and learn about them. The third challenge was creating an interactive cloud-based AR experience through which students could explore the robot system.
In the pedagogical part of the study, we developed and conducted a workshop, in
which the students learned about the TurtleBot2 and RACECAR MN robot systems through
augmented reality experiences. Our motivation was to engage the students in the study of
robot systems through online experimentation in VR and AR. We had to cope with two
main pedagogical challenges. The first was teaching advanced technological concepts to students, most of whom did not have any background in robotics. The second was using the VR and AR experiences to teach about real robot systems that could not be physically accessed.
In the educational research part, we examined the learning through online exploration
of robot systems in VR and AR and evaluated the contribution of this practice to fostering
integrative thinking, based on the students’ feedback. The motivation was to verify whether novice engineering students can understand the structure and functionality of robot systems through VR and AR experiences and whether this practice can foster their integrative thinking skills. The first challenge in conducting this research was that we could not observe student

learning directly. Second, the students used a variety of mobile devices and computers,
some of which could not support the developed AR experiences. The diversity of the
devices and computers affected the learning process and its outcomes. The third challenge
was related to the lack of a widely accepted notion of integrative thinking in the literature
as well as the absence of tools for its assessment in the area of engineering systems.
To answer the first research question, we examined the results of the workshop quizzes and students’ self-evaluations of the learning experience to determine whether they understood the structure and functionality of the studied robot systems. The collected data indicate that most students comprehended the structure and functionality of the robot systems they learned about. The students received high scores on both the TurtleBot2 and RACECAR MN knowledge quizzes and highly appreciated the contribution of the experiences with the AR app and the RacecarSim simulator to understanding the structure, components, and functionality of the robots. The students, however, noted that these experiences cannot completely replace actual practice with robots. Some students mentioned difficulties in working with augmented reality related to a lack of technical skills, incompatible smartphones, and misunderstanding of certain visualizations. We will take these comments into account when updating the AR apps.
To answer the second research question, we examined students’ reflections on the use of integrative thinking during the workshop activities. The students highly appreciated the experience gained in learning the structure and functionality of the robots. The high scores for the workshop’s contribution to the understanding of the two robots as integrated systems, and their relatively strong correlation with the overall contribution score, confirm that integrative thinking was a major component of the learning experience. The students applied IT from the beginning of the learning activities, when they set up a system consisting of a mobile device, a laptop, and the Vuforia View software and used the AR app to ‘place’ the virtual robot on the real home table. The students systematically analyzed the robot structure, components, and their interactions, using IT skills to make block diagrams of the robot systems. They explored disassembling and assembling the robots, modified them, and performed maintenance operations. In the practice with RacecarSim simulations, the students used integrative thinking to explore the dynamic behavior of the virtual racecar. The high scores given by the students for the usefulness of the workshop experience for studying real engineering systems indicate that, through the practice with the instructional robots, they gained a basic understanding of the architecture and operation of robot systems.
The current version of the workshop differed from the two previously conducted
workshops described in Section 2.3. We compared the results of the current and previous workshops in terms of the percentages of students who succeeded in the workshop tasks and who highly evaluated the workshop contributions. The results of the comparison are presented in Table 4.

Table 4. The students who succeeded in the workshop and highly evaluated its contributions (%).

                                      First Workshop        Second Workshop          Third Workshop
Successfully performed the task       100%                  86%                      Over 90%
Contribution to learning              –                     83%                      81%
the subject                                                 Robotic manipulations    Mobile robot system
Contribution to learning about        78%                   82%                      90%
robotics in IE                        Robot-manipulators    Robot-manipulators       Mobile robots
Contribution to training the skill    46%                   85%                      Over 80%
                                      Spatial skills        Spatial skills           Integrative thinking skills

The first row of Table 4 presents the evaluation of students’ performance of the workshop tasks. Rows 2–4 present the percentage of students who highly evaluated the workshops’ contributions (scores 3–5 on the 5-point scale). The comparison of the evaluations indicates that the current version of the workshop contributes as much as the previous versions.

5. Conclusions
In conclusion, our study demonstrates the feasibility and effectiveness of experiential learning of engineering systems through the exploration of their digital models in augmented and virtual reality environments. There are still some technical difficulties with the approach, but the results are promising. The approach becomes particularly suitable in conditions of student isolation and social distancing restrictions. Despite the restrictions, the use of AR, VR, and online technologies enabled us to provide students with meaningful practice in learning about mobile robots. In our study, the knowledge quizzes indicated that through the AR experience the students gained an understanding of the robot systems and their components. Through the practice with the RacecarSim simulator, the students developed their robot operation skills and learned to navigate the robot based on information from different sensors.
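To give a concrete sense of this practice, the sketch below illustrates the style of sensor-based navigation the students exercised in RacecarSim. It is written against the Python API described in the RACECAR-MN documentation [14] (racecar_core, racecar_utils); the API names are taken from that documentation to the best of our reading, while the steering logic itself is a simplified illustration, not the workshop’s exercise code.

    # A simplified sketch of LIDAR-based navigation in the style of the
    # RACECAR-MN Python library [14]. API names follow the library's public
    # documentation; the steering logic is an illustrative example only.
    import racecar_core
    import racecar_utils as rc_utils

    rc = racecar_core.create_racecar()

    def start():
        rc.drive.set_speed_angle(0, 0)  # stand still until the first update

    def update():
        scan = rc.lidar.get_samples()  # distances (cm); 0 deg = front, clockwise
        # Closest obstacle in the front-left and front-right sectors
        _, left_dist = rc_utils.get_lidar_closest_point(scan, (315, 355))
        _, right_dist = rc_utils.get_lidar_closest_point(scan, (5, 45))
        # Steer away from the nearer side: a positive angle turns right
        angle = rc_utils.clamp((right_dist - left_dist) / 100, -1.0, 1.0)
        rc.drive.set_speed_angle(0.5, angle)

    rc.set_start_update(start, update)
    rc.go()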
Our study proposed to foster the integrative thinking skills of first-year engineering
students through practice with robot systems and showed a possible approach to such
a practice in AR and VR environments. From the theoretical perspective, the study con-
tributes to shaping the concept of student integrative thinking in the context of learning
about robot systems. Based on our study, we recommend further research and development
of AR experiences and virtual simulations that will combine the experiential learning of
engineering systems with the development of generic skills needed for modern engineering.
In further R&D, robotics can serve as a testbed for developing pedagogical strategies for online learning practice with engineering systems, for introducing students to innovative digital technologies, for supporting the learning of STEM subjects, and for developing the cognitive and social skills needed in the age of Industry 4.0.
A decade ago, Alimisis [38] analyzed the state of educational robotics and identified several challenges it faces as a tool to develop students’ cognitive and social skills and to support the learning of STEM subjects. Here, we formulate our perception of the new challenges that educational robotics has to meet in the current times of digital transformation and the pandemic:
• Develop innovative technology-rich learning environments accessible for different
types of experiential practice with instructional models of modern robots.
• Transform learning activities that treat robots as black boxes into ones in which students
explore the robot’s architecture and functionality and use it as a platform for making.
• Prioritize learning activities that foster the cognitive and social skills required for the
use and development of modern engineering systems.
• Develop new programs in educational robotics based on the understanding of its
central role in preparing students for life in the modern technological world.
• Make the assessment of learning outcomes, based on students’ progress in both knowledge and skills, an integral part of educational robotics programs.
Our intentions for the robotics workshop and the educational study, as described in
Section 2.2, reflect our endeavor to meet these challenges. We call for wider involvement in
addressing the new challenges in educational robotics programs and research.

Author Contributions: Conceptualization and methodology, I.V. and D.C.; software, S.G.; validation
A.P.; formal analysis, I.V. and A.P.; investigation, D.C., S.G. and I.V.; data curation, A.P., S.G. and H.P.-
V.; writing—original draft preparation, I.V., D.C. and H.P.-V.; editing, I.V., D.C. and A.P.; supervision,
I.V.; project administration, I.V. and D.C. All authors have read and agreed to the published version
of the manuscript.
Funding: This research was funded by PTC Inc. Grant 2022409 and by the Technion Autonomous Systems Program Grant 86600233.
Institutional Review Board Statement: The study was conducted in accordance with the Declaration of Helsinki and approved by the Behavioral Sciences Research Ethics Committee of the Technion–Israel Institute of Technology (Approval no. 2020-103, 8 December 2020).

Informed Consent Statement: Informed consent was obtained from all subjects involved in the
study. The goal and method of our study was explained to the participating students in advance.
The participants of this study were first-year engineering students from the Technion Faculty of Industrial Engineering and Management. The participants’ identities were kept anonymous in all our publications; no information was disclosed that can be used to recognize any specific individual. The researchers collected the research data with the consent of the course lecturer and with the permission of the faculty. All the authors of the manuscript agreed with its content, gave explicit consent to
submit it, and obtained consent from the responsible authorities at the Technion—Israel Institute of
Technology where the work has been carried out.
Data Availability Statement: Data and materials will not be published.
Conflicts of Interest: The authors declare that the research was conducted in the absence of any
commercial or financial relationships that could be construed as a potential conflict of interest.

References
1. Huber, A.M.; Waxman, L.K.; Dyar, C. Using systems thinking to understand the evolving role of technology in the design process.
Int. J. Technol. Des. Educ. 2022, 32, 447–477. [CrossRef]
2. Zhou, N.N.; Deng, Y.L. Virtual reality: A state-of-the-art survey. Int. J. Autom. Comput. 2009, 6, 319–325. [CrossRef]
3. Gorman, D.; Hoermann, S.; Lindeman, R.W.; Shahri, B. Using virtual reality to enhance food technology education. Int. J. Technol.
Des. Educ. 2022, 32, 1659–1677. [CrossRef] [PubMed]
4. Hoenig, W.; Milanes, C.; Scaria, L.; Phan, T.; Bolas, M.; Ayanian, N. Mixed reality for robotics. In Proceedings of the 2015 IEEE/RSJ
International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015.
5. Song, P.; Yu, H.; Winkler, S. Vision-based 3D finger interactions for mixed reality games with physics simulation. In Proceedings
of the ACM SIGGRAPH International Conference on Virtual Reality Continuum and Its Applications in Industry, Singapore, 8–9
December 2008.
6. Ozdemir, M.; Sahin, C.; Arcagok, S.; Demir, M.K. The effect of augmented reality applications in the learning process: A
meta-analysis study. Eurasian J. Educ. Res. 2018, 18, 165–186. [CrossRef]
7. Thees, M.; Kapp, S.; Strzys, M.P.; Beil, F.; Lukowicz, P.; Kuhn, J. Effects of augmented reality on learning and cognitive load in
university physics laboratory courses. Comput. Hum. Behav. 2020, 108, 106316. [CrossRef]
8. Altmeyer, K.; Kapp, S.; Thees, M.; Malone, S.; Kuhn, J.; Brünken, R. The use of augmented reality to foster conceptual knowledge
acquisition in STEM laboratory courses—Theoretical background and empirical results. Br. J. Educ. Technol. 2020, 51, 611–628.
[CrossRef]
9. Singh, G.; Mantri, A.; Sharma, O.; Dutta, R.; Kaur, R. Evaluating the impact of the augmented reality learning environment on
electronics laboratory skills of engineering students. Comput. Appl. Eng. Educ. 2019, 27, 1361–1375. [CrossRef]
10. AlNajdi, S.; Alrashidi, M.; Almohamadi, K. The effectiveness of using augmented reality (AR) on assembling and exploring
educational mobile robot in pedagogical virtual machine. Interact. Learn. Environ. 2020, 28, 964–990. [CrossRef]
11. Borrero, A.; Márquez, J. A pilot study of the effectiveness of augmented reality to enhance the use of remote labs in electrical
engineering education. J. Sci. Educ. Technol. 2012, 21, 540–557. [CrossRef]
12. Verner, I.; Cuperman, D.; Polishuk, A. Inservice teachers explore RACECAR MN in physical and augmented environments. In
Proceedings of the 2022 17th Annual System of Systems Engineering Conference (SOSE), Rochester, NY, USA, 7–11 June 2022;
pp. 228–230.
13. TurtleBot2. Open-Source Robot Development Kit. Available online: https://www.turtlebot.com/turtlebot2/ (accessed on 28
August 2022).
14. MITLL RACECAR-MN. Available online: https://mitll-racecar-mn.readthedocs.io/en/latest/ (accessed on 28 August 2022).
15. Wang, X.; Mayer, R.E.; Zhou, P.; Lin, L. Benefits of interactive graphic organizers in online learning: Evidence for generative
learning theory. J. Educ. Psychol. 2021, 113, 1024–1037. [CrossRef]
16. Clark, D.; Linn, M.C. Designing for knowledge integration: The impact of instructional time. J. Learn. Sci. 2003, 12, 451–493.
[CrossRef]
17. Ausubel, D.P. A subsumption theory of meaningful verbal learning and retention. J. Gen. Psychol. 1962, 66, 213–224. [CrossRef]
[PubMed]
18. De Weck, O.L.; Roos, D.; Magee, C.L. Engineering Systems: Meeting Human Needs in a Complex Technological World; MIT Press:
Cambridge, MA, USA, 2011; pp. 168–184.
19. Shen, Z.; Zhu, Y. Complex engineering system learning through study of engineering failure cases using 3D animations. In
Proceedings of the ASEE 2011 Annual Conference and Exposition, Vancouver, BC, Canada, 26–29 June 2011.
20. Schörger, D.; Sewchurran, K. Towards an interpretive measurement framework to assess the levels of integrated and integrative
thinking within organizations. Risk Gov. Control Financ. Mark. Inst. 2015, 5, 44–66.
21. Kallio, E. Integrative thinking is the key: An evaluation of current research into the development of adult thinking. Theory Psychol.
2011, 21, 785–801. [CrossRef]

22. Martin, R.L. The Opposable Mind: How Successful Leaders Win through Integrative Thinking; Harvard Business School Publishing:
Boston, MA, USA, 2009.
23. Tynjälä, P.; Kallio, E.K.; Heikkinen, H.L. Professional expertise, integrative thinking, wisdom, and phronesis. In Development of
Adult Thinking, 1st ed.; Routledge: London, UK, 2020; pp. 156–174.
24. Qadir, J.; Yau, K.L.A.; Imran, M.A.; Al-Fuqaha, A. Engineering education, moving into 2020s: Essential competencies for effective 21st century electrical & computer engineers. In Proceedings of the 2020 IEEE Frontiers in Education Conference (FIE), Uppsala, Sweden, 21–24 October 2020.
25. Quigley, M.; Conley, K.; Gerkey, B.; Faust, J.; Foote, T.; Leibs, J.; Wheeler, R.; Ng, A.Y. ROS: An open-source robot operating
system. In Proceedings of the ICRA Workshop on Open-Source Software, Kobe, Japan, 12–13 May 2009.
26. Pasek, Z.J. Helping engineers develop and exercise creative muscles. In Proceedings of the Canadian Engineering Education
Association Conference (CEEA), Toronto, ON, Canada, 4–7 June 2017.
27. Berlin, N.; Tavani, J.L.; Besançon, M. An exploratory study of creativity, personality, and schooling achievement. Educ. Econ. 2016, 24, 536–556. [CrossRef]
28. Malik, A.; Setiawan, A. The development of higher order thinking laboratory to improve transferable skills of students. In
Proceedings of the 2015 International Conference on Innovation in Engineering and Vocational Education, Bandung, Indonesia,
14 November 2015.
29. Asok, D.; Abirami, A.M.; Angeline, N.; Lavanya, R. Active learning environment for achieving higher-order thinking skills in engineering education. In Proceedings of the 2016 IEEE 4th International Conference on MOOCs, Innovation and Technology in Education (MITE), Madurai, India, 9–10 December 2016.
30. Rawat, K.S.; Massiha, G.H. A hands-on laboratory-based approach to undergraduate robotics education. In Proceedings of the IEEE International Conference on Robotics and Automation, New Orleans, LA, USA, 26 April–1 May 2004.
31. Cuperman, D.; Verner, I.; Perez, H.; Gamer, S.; Polishuk, A. Fostering integrative thinking through an online AR-based robot
system analysis. In Proceedings of the World Engineering Education Forum, Madrid, Spain, 15–18 November 2021.
32. Ortega, P.E.; Lagoudas, M.Z.; Froyd, J.E. Overview and comparison of assessment tools for integrative thinking. In Proceedings
of the 2017 ASEE Annual Conference & Exposition, Columbus, OH, USA, 24–28 June 2017.
33. Kraiger, K.; Ford, J.K.; Salas, E. Application of cognitive, skill-based, and affective theories of learning outcomes to new methods of training evaluation. J. Appl. Psychol. 1993, 78, 311–328. [CrossRef]
34. Verner, I.M.; Gamer, S. Robotics laboratory classes for spatial training of novice engineering students. Int. J. Eng. Educ. 2015,
31, 1376–1388.
35. Verner, I.; Cuperman, D.; Gamer, S.; Polishuk, A. Exploring affordances of robot manipulators in an introductory engineering
course. Int. J. Eng. Educ. 2020, 36, 1691–1707.
36. Chen, S.; Fishberg, A.; Shimelis, E.; Grimm, J.; van Broekhoven, S.; Shin, R.; Karaman, S. A Hands-on Middle-School Robotics
Software Program at MIT. In Proceedings of the IEEE Integrated STEM Education Conference, Princeton, NJ, USA, 1 August 2020.
37. Goodrick, D. Comparative Case Studies; Methodological Briefs: Impact Evaluation 9; United Nations Children’s Fund (UNICEF):
Florence, Italy, 2014; pp. 1–17.
38. Alimisis, D. Educational robotics: Open questions and new challenges. Themes Sci. Technol. Educ. 2013, 6, 63–71.
