Computer Science Department

A Web Virtual Reality Tour of DePauw

Grant Skipper

This project will use cutting-edge web technologies to build a website that lets users treat their mobile phones as VR headsets and take a virtual reality tour of DePauw. Mozilla recently released an API that exposes a smartphone's native gyroscope hardware to web applications, helping render virtual reality experiences with WebGL. Users will be able to look at points of interest and be transported to that location in VR.
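As a rough sketch of the idea (frameworks like A-Frame do this wiring internally, and the handler body here is only illustrative), a web page can read the phone's orientation sensors like so:

```javascript
// Sketch only: reading phone orientation through the browser's
// DeviceOrientation API. degToRad is a pure helper; the event handler
// itself can only fire in a browser, so it is guarded.
function degToRad(deg) {
  return deg * Math.PI / 180;
}

if (typeof window !== "undefined") {
  window.addEventListener("deviceorientation", function (event) {
    // alpha: compass heading, beta: front-back tilt, gamma: left-right tilt
    var yaw = degToRad(event.alpha || 0);
    var pitch = degToRad(event.beta || 0);
    // ...a framework would feed yaw/pitch into the camera rotation here...
  });
}
```

Because the sensor values arrive in degrees while graphics math works in radians, the conversion helper is the one piece that is pure and reusable outside the browser.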

This project will build the experience on A-Frame, an existing WebVR JavaScript framework, using 360-degree equirectangular panorama photos taken with the “360 Panorama” application on iOS. A-Frame leverages Three.js for the graphics component of the project. Three.js is a library that makes WebGL easier to work with; WebGL itself is the web equivalent of OpenGL (the Open Graphics Library). Because WebGL is a relatively new and low-level technology, the use of Three.js makes this powerful tool much more approachable for an application developer. While Three.js is still considered an advanced JavaScript library, A-Frame provides an easier, higher-level way of working with it.
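To give a sense of how little markup A-Frame requires, a minimal scene of this kind might look like the following (illustrative only: the script version, the image path, and the scene contents are placeholders, not assets from this project):

```html
<!-- Minimal A-Frame sketch: an equirectangular panorama wrapped
     around the camera as a sky sphere, with a gaze cursor so the
     user can select points of interest by looking at them. -->
<html>
  <head>
    <script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-sky src="panorama.jpg"></a-sky>
      <a-camera>
        <a-cursor></a-cursor>
      </a-camera>
    </a-scene>
  </body>
</html>
```

The `<a-sky>` element is what maps a 360-degree equirectangular photo onto the inside of a sphere surrounding the viewer; behind the scenes, A-Frame translates this markup into Three.js objects rendered with WebGL.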

The virtual reality experience of this web application is enhanced when combined with a ‘mobile-VR’ headset. Mobile-VR headsets act as a mount for the phone and generally contain little to no circuitry or electronic hardware (in contrast to more ‘mainstream’ headsets like the Oculus Rift). What sets the mobile-VR headset apart from simply holding the phone close to the user’s face is not only the hands-free immersion, but also that users can opt to have the experience rendered with ‘VR optics’. In reality, we do not see the world the way we view a screen: each of our two eyes sees the environment separately, and the two images are seamlessly combined in the mind to produce the image we “see”. In ‘VR’ mode, the rendering simulates this more ‘biological’ immersion by splitting the screen in half vertically and displaying two slightly offset renderings of the same view; when the phone is placed in the headset, each of the user’s eyes sees a separate half. The headset adds further immersion through its fish-eye lenses, which produce a depth effect for the environment. In this manner, a headset lets a user view the virtual environment in a simulated but more realistic way than holding the phone up and seeing the screen as is.
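The split-screen geometry described above can be sketched as a small pure function (illustrative only; A-Frame and Three.js handle stereo rendering themselves, and the interpupillary-distance value below is just a typical figure, not a project parameter):

```javascript
// Sketch (not project code): side-by-side stereo rendering places one
// virtual camera per eye, each shifted horizontally by half the
// interpupillary distance (IPD) from the head's center position.
function eyePositions(center, ipd) {
  var half = ipd / 2;
  return {
    left:  { x: center.x - half, y: center.y, z: center.z },
    right: { x: center.x + half, y: center.y, z: center.z }
  };
}

// A typical adult IPD is about 0.064 m (64 mm); 1.6 m approximates
// standing eye height in the scene.
var eyes = eyePositions({ x: 0, y: 1.6, z: 0 }, 0.064);
```

Each eye's camera renders into its own half of the vertically split screen; the small horizontal offset between the two renderings is what the brain fuses into a sense of depth.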