
Reality Check: Steve Pardo on AR/VR and Sound

Q: First of all, can you explain the difference between AR and VR?
A: While they share most of the same underlying technologies and research, the mediums are experientially quite different. AR, or augmented reality, takes in the outside world and places digital objects and user interfaces on top of it. For example, the iPhone, using its onboard camera and ARKit (a first-party software development library), can detect the dimensions of a room and place a 3D object within that space, displaying the result on the screen. With VR, or virtual reality, your field of view in the headset is taken up completely by the digital render, allowing you to feel as if you are in a brand-new place, fully immersed in this new world.
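
To make that concrete, here is a minimal sketch, not from the interview, of how a developer might use ARKit with SceneKit to detect a horizontal plane and anchor a simple 3D object to it; the class name and box dimensions are illustrative.

```swift
import ARKit
import SceneKit
import UIKit

// Minimal sketch: detect horizontal planes with ARKit and place a small box on one.
class ARPlacementViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // Use the camera and motion sensors to track the room and look for flat surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    // ARKit calls this when it adds an anchor; attach a virtual box to detected planes.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
        box.position = SCNVector3(0.0, 0.05, 0.0) // rest the box on top of the detected plane
        node.addChildNode(box)
    }
}
```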

Q: You teach a course at MIT on AR/VR and audio. Can you talk a little bit about that class and how it came to be?
A: As this is still relatively uncharted territory, we started the semester with pie-in-the-sky conversations and a survey of the field from many points of view, with special guests. But, since the course combines MIT students and Berklee students, we thought it would be very special to have a music focus from the outset.

We hit our stride during the middle of the class and covered both the technical and the theoretical. We would also share what we were working on, the current state of projects, and any interesting happenings in the VR world. By the end of the course, all of the students were grouped into teams – not unlike the startup world. There were five to seven teams, and each had a product of its own. By the end, it really felt like everyone had a feel for the scope of AR/VR and was getting their hands dirty.

Q: Northwestern is offering a similar class during the winter quarter called Virtual Audio Production, and it will blend the technical and theoretical sides of AR/VR. How does your course tackle the relationship between AR/VR and sound?
A: Early in the semester, we surveyed the audio technology available at the time and gave some examples of how I and other audio designers are using it in our games and applications. Given that most of these techniques are brand new, even to the most experienced sound designers, we wanted to focus on practical development for those who were interested. Once teams were formed and projects were underway, we encouraged the students to design experiences that would be enhanced by audio in some meaningful way. Students were guided one-on-one by me and other assistants in order to get the most out of the talent in the room. Some great examples we saw were an AR music-visualizer concert experience and a VR Shakespearean play.

Q: How does sound factor into the AR/VR experience?
A: Initially, for an everyday VR experience, whether it be a game, a 360-degree movie, or an interactive educational experience, the sound has the potential to be much more spatialized and acoustically accurate than it would be on standard loudspeakers. There are new technologies and plugins emerging that the sound designer can use to create a more realistic experience, primarily utilizing a technique known as the HRTF (head-related transfer function), which renders spatialized audio binaurally. If something is happening right in front of you, you'll hear the sound located in that exact spot with hyper-accuracy.
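
As an illustration, and not a description of Harmonix's own toolchain, here is a small sketch of HRTF-based binaural rendering using Apple's AVAudioEngine and AVAudioEnvironmentNode; the file name, sample rate, and positions are placeholders.

```swift
import AVFoundation

// Minimal sketch: route a mono source through an AVAudioEnvironmentNode so it is
// rendered binaurally with an HRTF, placed two meters in front of the listener.
let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let player = AVAudioPlayerNode()

engine.attach(environment)
engine.attach(player)

// Spatialization expects a mono source feeding the environment node.
let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 1)
engine.connect(player, to: environment, format: monoFormat)
engine.connect(environment, to: engine.mainMixerNode, format: nil)

// Select the HRTF rendering algorithm and position the source and listener in 3D space.
player.renderingAlgorithm = .HRTF
player.position = AVAudio3DPoint(x: 0, y: 0, z: -2)            // 2 m in front of the listener
environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)

do {
    // "footsteps.caf" is a placeholder asset name.
    let file = try AVAudioFile(forReading: URL(fileURLWithPath: "footsteps.caf"))
    try engine.start()
    player.scheduleFile(file, at: nil)
    player.play()
} catch {
    print("Audio setup failed: \(error)")
}
```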

One thing we’ve noticed in VR development is that you have these new affordances that allow you to interact with the space in new ways. It opens up the possibility space for creating music, such as performing on a musical instrument. At Harmonix, we are constantly experimenting with new ways to do just that, which you can see in Rock Band VR, which shipped on the Oculus Rift in 2017.

Q: What are some of the ways that Harmonix and other companies like it are making AR/VR more accessible to those outside of the sound industry?
A: We are working hard to bring interactive musical experiences to the masses. The Rock Band VR improvisational guitar experience is something a lot of non-musicians have never had before – both from an accessibility standpoint and simply for the mass appeal of being able to shred on guitar in front of hundreds of people without having to practice a lick. For anyone who is curious or interested in learning, we can bring them these brand-new experiences using these immersive new mediums. And outside of music as well! For example, take someone studying marine biology: developers can put them under the ocean, experiencing the changing coral reefs right before their eyes. The sky’s the limit.