As design teams become more distributed, sharing and interpreting complex data about design concepts, prototypes, and environments have become increasingly challenging. The size and quality of the data that can be captured and shared directly affect the ability of its recipients to collaborate and provide meaningful feedback. To mitigate these challenges, the authors propose the real-time translation of physical objects into an immersive virtual reality environment using readily available red, green, blue, and depth (RGB-D) sensing systems and standard network connections. The emergence of commercial, off-the-shelf RGB-D sensing systems, such as the Microsoft Kinect, has enabled rapid three-dimensional (3D) reconstruction of physical environments. The authors present a method that employs 3D mesh reconstruction algorithms and real-time rendering techniques to capture physical objects in the real world and represent their 3D reconstructions in an immersive virtual reality environment with which the user can then interact. These features allow distributed design teams to share and interpret complex 3D data in a natural manner. The method reduces the processing requirements of the data capture system while keeping it portable, and it provides an immersive environment in which designers can view and interpret the data remotely. A case study involving a commodity RGB-D sensor and multiple computers connected through standard TCP internet connections demonstrates the viability of the proposed method.
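The pipeline described above — capturing RGB-D frames on one machine and streaming them over a standard TCP connection to a remote rendering host — requires framing each capture so the receiver can reassemble it from the raw byte stream. The sketch below illustrates one minimal, hypothetical wire format (a length-prefixed header followed by the raw payload); the header fields, function names, and loopback demo are illustrative assumptions, not the paper's actual protocol.

```python
import socket
import struct
import threading

# Hypothetical wire format (not from the paper): each RGB-D frame is sent as a
# length-prefixed packet so the receiver can reassemble it from the TCP stream.
#   header: width (uint32), height (uint32), payload length (uint32), big-endian
#   payload: raw interleaved color + depth bytes
HEADER = struct.Struct(">III")

def send_frame(sock, width, height, payload):
    """Frame one RGB-D capture and write it to the socket."""
    sock.sendall(HEADER.pack(width, height, len(payload)) + payload)

def recv_exact(sock, n):
    """Read exactly n bytes, looping because TCP may deliver partial chunks."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed mid-frame")
        buf += chunk
    return buf

def recv_frame(sock):
    """Inverse of send_frame: parse the header, then read the full payload."""
    width, height, length = HEADER.unpack(recv_exact(sock, HEADER.size))
    return width, height, recv_exact(sock, length)

if __name__ == "__main__":
    # Loopback demo: one sender, one receiver, a single synthetic 2x2 frame.
    server = socket.socket()
    server.bind(("127.0.0.1", 0))
    server.listen(1)
    port = server.getsockname()[1]

    frame = bytes(range(16))  # stand-in for interleaved color+depth data

    def sender():
        with socket.create_connection(("127.0.0.1", port)) as s:
            send_frame(s, 2, 2, frame)

    t = threading.Thread(target=sender)
    t.start()
    conn, _ = server.accept()
    w, h, payload = recv_frame(conn)
    t.join()
    conn.close()
    server.close()
    print(w, h, payload == frame)  # -> 2 2 True
```

In a real deployment the payload would carry the sensor's color and depth buffers (possibly compressed, as with the JPEG encoding the case study's toolchain supports), but the length-prefixed framing shown here is what keeps frame boundaries intact over a stream-oriented TCP connection.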
