This paper explores a novel human–machine interaction (HMI) paradigm that leverages the sensing, storage, computation, and communication (SSCC) capabilities of mobile devices to provide intuitive interactions with dynamic systems. The paradigm addresses the fundamental challenges of such interactions by integrating computer vision, 3D virtual graphics, and touchscreen sensing to develop mobile apps that deliver interactive augmented reality (AR) visualizations. Whereas prior approaches relied on laboratory-grade hardware, e.g., a personal computer (PC) and vision system, to stream video to remote users, our approach exploits the inherent mobility of mobile devices to provide users with mixed-reality (MR) environments in which the laboratory test-bed and augmented visualizations coexist and interact in real time, promoting immersive learning experiences that do not yet exist in engineering laboratories. By pointing the rear-facing camera of a mobile device at the system from an arbitrary perspective, computer vision techniques retrieve physical measurements that are used to render interactive AR content or to perform feedback control. Future work will examine the potential of our approach for teaching the fundamentals of dynamic systems, automatic control, robotics, etc., through inquiry-based activities with students.
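To illustrate the kind of vision-based measurement and feedback loop described above, the sketch below maps a detected marker's pixel centroid to a physical position on a ball-and-beam-style test-bed and computes a simple proportional-derivative (PD) command. The function names, pixel-to-meter calibration, and gains are illustrative assumptions for exposition, not the paper's actual implementation.

```python
# Hypothetical sketch (not the authors' code): convert a vision measurement
# in pixel coordinates into a physical position, then apply a PD control law.

def pixel_to_position(cx_px, beam_left_px, beam_right_px, beam_length_m):
    """Linearly map the marker's x-centroid (pixels) to a position (meters)
    measured from the left end of the beam, assuming a fronto-parallel view."""
    frac = (cx_px - beam_left_px) / (beam_right_px - beam_left_px)
    return frac * beam_length_m

def pd_control(error, prev_error, dt, kp=2.0, kd=0.5):
    """PD law: command proportional to the error and its finite-difference rate."""
    derivative = (error - prev_error) / dt
    return kp * error + kd * derivative

# Example: a ball detected at pixel 420 on a beam spanning pixels 100..740
# of the camera frame, with a 0.5 m beam; regulate toward the midpoint.
position = pixel_to_position(420, 100, 740, 0.5)  # 0.25 m (beam midpoint)
command = pd_control(0.25 - position, 0.0, 0.02)  # zero error -> zero command
```

In practice the fronto-parallel assumption would be replaced by a camera-pose estimate (e.g., a planar homography), which is what allows the device to view the test-bed from an arbitrary perspective.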
