Abstract. Research in human spatial cognition has benefited from the advent of virtual environment technology [1], including the Cave Automatic Virtual Environment (CAVE) [2]. Eye tracking has also become a common method for addressing different topics in spatial cognition research [3]. Although CAVEs with real-time gaze interaction [4] have been constructed for research in social interaction [5], such as the EyeCVE [6], [7], this type of system has not been designed for research in spatial cognition. We present a MiddleVR- and Unity-based framework that enables real-time gaze interaction within the CAVE and lays the foundation for future experiments in spatial cognition. Users of this framework will benefit from the realistic physics and rendering capabilities of the Unity game engine, as well as the configurability of hardware (e.g., control devices, trackers) provided by MiddleVR.

This semester thesis was conducted with the Chair of Cognitive Science and the Computer Graphics Laboratory of ETH Zurich, supervised by Iva Barisic, Dr. Tyler Thrash, Prof. Dr. Christoph Hölscher, and Prof. Dr. Markus Gross. The thesis is available upon request.

References:

[1] D. Waller, E. Bachmann, E. Hodgson, and A. C. Beall, “The HIVE: A huge immersive virtual environment for research in spatial cognition,” Behavior Research Methods, vol. 39, no. 4, pp. 835–843, 2007.
[2] C. Cruz-Neira, D. J. Sandin, T. A. DeFanti, R. V. Kenyon, and J. C. Hart, “The CAVE: Audio visual experience automatic virtual environment,” Communications of the ACM, vol. 35, no. 6, pp. 64–72, 1992.
[3] S. Schwarzkopf, R. von Stülpnagel, S. J. Büchner, L. Konieczny, G. Kallert, and C. Hölscher, “What lab eye-tracking tells us about wayfinding: A comparison of stationary and mobile eye-tracking in a large building scenario,” in 1st International Workshop on Eye Tracking for Spatial Research (ET4S), 2013.
[4] J. L. Gabbard, K. Swartz, K. Richey, and D. Hix, “Usability evaluation techniques: A novel method for assessing the usability of an immersive medical visualization VE,” Simulation Series, vol. 31, pp. 165–170, 1999.
[5] J. Ciger, B. Herbelin, and D. Thalmann, “Evaluation of gaze tracking technology for social interaction in virtual environments,” in Proceedings of the 2nd Workshop on Modeling and Motion Capture Techniques for Virtual Environments (CAPTECH’04), 2004.
[6] W. Steptoe, R. Wolff, A. Murgia, E. Guimaraes, J. Rae, P. Sharkey, D. Roberts, and A. Steed, “Eye-tracking for avatar eye-gaze and interactional analysis in immersive collaborative virtual environments,” in Proceedings of the 2008 ACM Conference on Computer Supported Cooperative Work, pp. 197–200, ACM, 2008.
[7] D. Roberts, R. Wolff, J. Rae, A. Steed, R. Aspin, M. McIntyre, A. Pena, O. Oyekoya, and W. Steptoe, “Communicating eye-gaze across a distance: Comparing an eye-gaze enabled immersive collaborative virtual environment, aligned video conferencing, and being together,” in IEEE Virtual Reality Conference (VR 2009), pp. 135–142, IEEE, 2009.
