During the summer of 2014, Mike interned at the Shared Reality Lab at McGill University. When he arrived, the Compiz-based projector calibration system for the haptic floor of the lab's multi-modal CAVE was broken. Mike replaced it with a Unity-based system and reengineered the visual channel and visual-haptic interfaces of the CAVE. As a result, the projectors could be calibrated again, and responsive virtual worlds (e.g., landscapes containing forests, lakes, snow, ice, and particle systems) could be displayed, including footprints that appeared when users stepped on the floor.
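Projector calibration for a floor display of this kind typically comes down to estimating a planar homography that maps projector pixel coordinates onto physical floor coordinates. The sketch below is illustrative only, not the lab's actual implementation: it estimates a homography from point correspondences using the direct linear transform (DLT), with made-up resolution and floor dimensions.

```python
import numpy as np

def compute_homography(src, dst):
    """Estimate the 3x3 homography mapping src points to dst points
    via the direct linear transform (DLT). Needs >= 4 correspondences."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A, i.e. the last row of V^T.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2,2] == 1

def apply_homography(H, pt):
    """Map a 2D point through H using homogeneous coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical correspondences: projector corners (pixels) measured
# against the floor quad they land on (meters).
projector_corners = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
floor_corners = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]

H = compute_homography(projector_corners, floor_corners)
center = apply_homography(H, (960, 540))  # projector center in floor space
```

In practice the floor-corner positions would come from physical measurement or an interactive alignment step, and the resulting homography would be applied per-projector so adjacent projected regions line up on the floor.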

Summer Undergraduate Research in Engineering (SURE) Program Poster (~80MB)
