VR (Virtual Reality) Studio: Media and Maker Commons

Regular hours: Monday 1-5 pm; Tuesday-Friday 10 am-5 pm; closed on holidays

The VR Studio is a space to experience and create augmented and virtual reality content. We use an HTC Vive Pro with a wireless adaptor, paired with a high-end PC that provides SteamVR, Viveport, Unity, Autodesk, and the Adobe suite.

About

The Media and Maker Commons has one HTC Vive Pro VR system. This system has an optional wireless adaptor that lets you move around in VR without being tethered to the computer by a cable.

Getting started

We offer online training through Canvas to teach you the minimum skills required to operate the HTC Vive Pro System. If you are not already familiar with VR, you will learn the basic principles needed to both experience and create a virtual environment.

Remember that after the online training, you still need to complete a hands-on session with a member of our staff. In that session, you will review the basic principles and gain troubleshooting experience to help you avoid common AR/VR failures.

After you finish the online training, you can book a training slot (appointment) using the Canvas Calendar.

Tools

AR (augmented reality)/VR (virtual reality) hardware

The VR system available at the Media and Maker Commons is called a head-mounted display (HMD). It is an immersive type of VR that sits on top of your head, much like a helmet or goggles, and completely surrounds your view with computer graphics or video. You can look around the virtual world in 360 degrees. Some VR systems also include components that track your movements to give you six degrees of freedom (6DoF): you can crouch down, look under or over objects, and move your head, and the virtual world responds accordingly.
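
To make the "six degrees of freedom" idea concrete, here is a small, generic sketch of how a tracked headset pose is often represented: three values for position and three for orientation. This is an illustration only, not the Vive's actual data format.

    from dataclasses import dataclass

    @dataclass
    class HeadsetPose:
        """A 6DoF pose: three translational plus three rotational degrees of freedom."""
        # Position in metres (where your head is in the room)
        x: float
        y: float
        z: float
        # Orientation in degrees (where you are looking)
        pitch: float  # looking up or down
        yaw: float    # turning left or right
        roll: float   # tilting your head sideways

    # A rotation-only (3DoF) headset tracks just pitch, yaw, and roll;
    # a 6DoF system like the Vive Pro also tracks x, y, and z, so
    # crouching or leaning changes the position values as well.
    crouching = HeadsetPose(x=0.2, y=0.9, z=-0.5, pitch=-15.0, yaw=90.0, roll=0.0)
    print(crouching)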

The particular VR system available is the HTC Vive Pro, which consists of an HMD, two controllers, two base stations, and a wireless adaptor. The HMD is tethered, meaning it is connected to the computer with a long cable so it can use the computer's computational power to run large or graphics-intensive projects. The HTC Vive Pro uses the two base stations to provide 6DoF movement and controllers that are tracked in 3D space, so you can see your hands and interact with virtual objects. The optional wireless adaptor allows you to move around freely without having to worry about tripping or getting tangled in cables, although the trade-off is that battery life lasts only up to 2.5 hours.

AR differs from VR in that virtual objects are overlaid on the real world, rather than the real world being completely replaced by a virtual one. AR is primarily accessible through smartphone technology. While dedicated AR headsets, such as Magic Leap, exist, we do not yet have access to them at the Media and Maker Commons.

AR/VR software

The software used to experience and create AR/VR at the Media and Maker Commons consists of Steam, Unity, Autodesk (Maya and 3ds Max), and Adobe Creative Cloud (all apps).

Steam

Steam is a digital distribution platform for video games developed by Valve Corporation, and SteamVR is the part of Steam dedicated to VR games and experiences. You can purchase and experience VR content by opening the Steam application on the computer.

Unity

Unity is a real-time development platform. You can create 2D, 3D, VR, and AR visualizations for games, automotive, transportation, film, animation, cinematics, architecture, engineering, construction, and more. Unity runs on Windows, Mac, and Linux and supports more than 25 target platforms, including mobile, desktop, consoles, and virtual reality.

AR/VR platforms include:

  • Oculus Rift
  • Google Cardboard
  • Steam VR
  • PlayStation VR
  • Gear VR
  • Windows Mixed Reality
  • Daydream
  • Apple's ARKit
  • Google's ARCore
  • Vuforia
  • Magic Leap

Unity has a range of artist-friendly tools for developing immersive experiences and worlds, as well as plenty of developer tools for implementing logic and building highly customized experiences.

You can open your Unity project, or start a new one, by opening the Unity Hub app on the computer and selecting the correct version of Unity for your project. Keep in mind that opening a project with a different or newer version of Unity than the one it was created with may cause it to break. Always keep a backup copy!
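
If you are not sure which editor version your project was last saved with, Unity records it in the project's ProjectSettings/ProjectVersion.txt file, and you can then pick the matching version in Unity Hub. Below is a minimal Python sketch of that check (our own illustration, not an official Unity tool); it assumes the standard Unity project layout, and the project path used at the bottom is a hypothetical example.

    # Minimal sketch: read the editor version a Unity project was saved with,
    # so you can select the matching version in Unity Hub before opening it.
    from pathlib import Path

    def unity_editor_version(project_dir: str) -> str:
        """Return the m_EditorVersion value from ProjectSettings/ProjectVersion.txt."""
        version_file = Path(project_dir) / "ProjectSettings" / "ProjectVersion.txt"
        for line in version_file.read_text().splitlines():
            if line.startswith("m_EditorVersion:"):
                return line.split(":", 1)[1].strip()
        raise ValueError(f"No m_EditorVersion entry found in {version_file}")

    if __name__ == "__main__":
        # Hypothetical example path; replace it with your own project folder.
        print(unity_editor_version("D:/Projects/MyVRProject"))

Checking this before opening a project is a simple way to avoid accidentally upgrading it with a newer editor.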

Autodesk

Autodesk provides professional 3D software, including Maya and 3ds Max, which are both used for 3D modeling, animation, and rendering. Maya and 3ds Max are quite similar; the main consideration when choosing between them is your operating system, since 3ds Max is only compatible with Windows, while Maya runs on Windows, Linux, and macOS. Some developers say Maya is best for character animation, while 3ds Max has a more robust modeling toolkit.

Adobe 

Adobe's apps focus on multimedia and creative work and include Photoshop, Illustrator, and Premiere Pro. You can use these tools in your AR/VR project to make your own textures, materials, graphics, and 360/VR videos. The software has built-in tutorials with example projects, so you can follow along and try it out for yourself!

Examples

AR/VR is a powerful and versatile medium because of its immersive properties and its ability to generate impossible or improbable worlds. Here are a few areas where people have used AR/VR so far; you can experience these for yourself or get inspired to create your own reality:

  • Arts
  • Education
  • Games
  • Healthcare
  • Journalism
  • Social

AR/VR at SFU

There are many SFU researchers using AR/VR in their work. Check out their websites for more info!

iVizLab, Dr. Steve DiPaola
iVizLab focuses on AI-based computational models of human characteristics such as expression, emotion, behavior, empathy, and creativity, including computer graphics-based facial/character systems and AI-based cognitive modelling systems.

eBrain Lab, Dr. Faranak Farzan
eBrain Lab is currently focused on addressing the lack of early diagnostic and effective therapeutic solutions for youth mental health and addiction recovery, using various forms of neurotechnologies, data-mining techniques, virtual/augmented reality, behavioral assessments, and computerized behavioral training.

The Pain Studies Lab, Dr. Diane Gromala
The Pain Studies Lab is a research group and physical research space funded by the Canada Foundation for Innovation (CFI) at Simon Fraser University’s School of Interactive Arts and Technology (SIAT).

Spatial Interface Research Lab, Dr. Nick Hedley
SIRL is a research lab dedicated to geospatial interface and geovisualization research, based in the Department of Geography.

Making Culture Lab, Dr. Kate Hennessy
Making Culture Lab explores the collaborative development and evaluation of culturally specific applications of new media in public space, museums and communities, both online and on the ground.

SNF New Media Lab, Dr. Dimitris Krallis
SNF New Media Lab brings together revolutionary pedagogy and cutting-edge technologies to preserve the Modern Greek language in the diaspora for generations to come.

Educational Technology and Learning Design, Dr. Paula MacDowell
Dr. Paula MacDowell is a Limited Term Lecturer in the Faculty of Education. As a design and technology specialist, Dr. MacDowell is working on research and advocacy initiatives to empower children and youth through education and technology. She has used AR assignments as a teaching tool.

Metacreation Lab, Dr. Philippe Pasquier
Metacreation Lab aims to investigate the theory and practice of computational creativity, through the development of artificially creative musical systems (metacreations) and computational systems for supporting musical creativity.

Experimental Robotics Lab, Dr. Shahram Payandeh
ERL developed a virtual environment to help train surgeons in performing laparoscopic surgery, using both 3D and 5D input devices as well as devices generating haptic feedback. ERL is now shifting toward developing aging-in-place technologies.

iSPACE Lab, Dr. Bernhard Riecke
iSPACE Lab uses knowledge from cognitive psychology to design novel, more effective human-computer interfaces and interaction paradigms that enable similarly effective processes in computer-mediated environments like virtual reality (VR), immersive gaming, and multimedia.

Bioinformatics Visualization Lab, Dr. Chris Shaw
Dr. Shaw is the co-author of the first virtual reality application (MR Toolkit) and has created numerous virtual environments and software for medical applications through collaborations at Johns Hopkins medical school and the Centers for Disease Control (CDC). His other current areas of expertise are bioinformatics, visual analytics, and two-handed interfaces for 3D applications.

VVISE Lab, Dr. Wolfgang Stuerzlinger
The VVISE group spans the fields of Human-Computer Interaction, Virtual and Augmented Reality, Visual and Immersive Analytics, large displays, 3D User Interfaces, and both software and hardware systems.

SFU courses

Looking to learn more about AR/VR? Check out these courses offered right here at SFU:

FCAT Semester in Alternate Realities
Named one of SFU's coolest courses by Maclean's, this full-time, 15-credit course addresses real-world problems using xR technologies and documented reflection. Project descriptions and videos are available online.

IAT 443: Interactive Video
Students explore video within technologies ranging from cell phones, mobile locative media, and handheld and wearable devices, to 3D immersive virtual and/or networked environments, video art installations, multiple scales of display technology, and responsive spaces.

IAT 445: Immersive Environments
In this project-based course, you will use an immersion framework to design, create, and evaluate immersive virtual environments and the interaction between the user and the virtual environment. Past project videos are available online.

GEOG 457: Geovisualization Interfaces
Students will gain practical experience in 3D geovisualization development, using: serious game engines; 3D physics-based geo-simulations; mobile virtual reality (VR) (Oculus Go, Samsung Gear VR); single-user VR (Oculus Rift, HTC Vive); room-scale immersive VR (HTC Vive); tangible augmented reality (TAR); mobile augmented reality (MAR); augmented GIS and geovisual analytics; and photospheres and videospheres for geographic narratives.

IAT 833: Performance, Technology and Embodiment
Students investigate interactive performance as an emerging practice-based area of research. The course grounds performance practice in a variety of fields, including human-computer interaction, phenomenology, artificial intelligence, embodied cognition and computation, computer games, and virtual worlds.

IAT 848: Mediated, Virtual, and Augmented Reality
This course covers the emerging field of virtual, augmented, mediated, and mixed reality from human-centered, research, technical, and ethical perspectives. It discusses and analyzes the design, development, usage, and evaluation of technologies that can be used to mediate human experience and interaction with virtual and real environments, including virtual, augmented, and mixed realities (together known as XR).

EDUC 890: Educational Media as Foundation of Curriculum
Students will examine the affordances and constraints of educational media currently in vogue, such as digital games, apps, wikis, blogs, podcasts, videos, virtual reality, and other interactive tools for teaching and learning. Students will apply the knowledge learned in this course by developing a technology-enabled system, curriculum, or artifact of some kind to address a problem of learning in a setting that is important to them.