Guest Post by Two-Time Grammy Award Nominee Morgan Page – This Article Originally Appeared on Morgan Page's Blog
After spending two weeks with the Oculus Rift and HTC Vive headsets, it’s clear that virtual reality will completely reshape the human experience. These devices transport you to another world. They will usher in an alternate reality, adding a new perspective to everything in modern life, and a physical interaction to every experience. Gaming will lead the charge, as well as cinematic narratives, but what about music?
Digital Audio Workstations (DAWs) have been around for over 20 years, but even with upgrades to 64-bit software, better interfaces, improved performance and processing, and higher-resolution displays, they remain fairly flat and linear, no matter how many drop shadows and embossed buttons are added.
We’ve put instruments in a flat box, replicating the best physical instruments in software – but there were trade-offs. Even if you perfectly match the sound quality, you lose the actual human contact: the vibration of the guitar’s body against your chest, the resonance of a piano’s soundboard, and the weight of the keys under your fingers. These elements change the expressiveness of instruments and steer your ideas in different directions. Beyond the haptics of the instrument, the workflow also changed: it became easier to save presets and copy existing sounds without having to patch them together with cables, and much easier to play instruments that once took years to master. Companies like Artiphon even advocate that “mastery is dead.”
Now we are faced with a new platform that could change everything - how we consume, create, perform, share, collaborate, learn, and teach music. Let’s look at some ideas and predictions for the years to come, and how it might influence your music.
CONSUMING MUSIC IN VIRTUAL REALITY
Last year I worked with developer 3D Live to create the first ever VR lyric video experience for Oculus DK2. We unveiled it at a party for the GRAMMYs, and brought it to Lollapalooza and SXSW. It was a natural evolution from the MPP3D tour, where we brought a semi truck’s worth of 3D LED screens to over 35 sold out shows nationwide.
The point of the experience was that you weren’t just listening to a stream, watching a music video, or panning around a 360 image – you were inside the experience. It combined elements of movies, gaming, and music videos, but was distinctly different. We built it with Unreal Engine, a video game engine normally used for first-person shooters. Both MPP3D and the Oculus experience changed how I approach the music creation process.
(Video: 3D Live on Vimeo)
Consumption in VR will center around concert video streams like Jaunt VR, rendered avatar performances like The Wave, artistic tools like Google’s Tilt Brush and Oculus Medium, social metaverse experiences like Project Sansar, interactive music videos, interactive movies, and audio games like Audioshield and Rock Band. Audioshield even lets you stream Soundcloud songs or scan your own music library, then dynamically builds a level based on the audio.
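How might a game “build a level based on the audio”? The real analysis in Audioshield is proprietary, but a minimal sketch of the core idea – placing hit markers wherever short-window energy jumps sharply – might look like this (the window size and threshold are illustrative assumptions):

```python
import math

def detect_onsets(samples, sample_rate=44100, window=1024, threshold=1.5):
    """Return times (in seconds) where short-window energy jumps sharply --
    a crude stand-in for the beat analysis a rhythm game might run."""
    energies = []
    for start in range(0, len(samples) - window, window):
        frame = samples[start:start + window]
        energies.append(sum(s * s for s in frame) / window)
    onsets = []
    for i in range(1, len(energies)):
        # Flag a frame whose energy is much larger than the previous frame's
        if energies[i] > threshold * (energies[i - 1] + 1e-6):
            onsets.append(i * window / sample_rate)
    return onsets

# Synthetic check: silence followed by a 440 Hz burst -- the single
# detected onset lands where the burst begins (~0.093 s in).
signal = [0.0] * 4096 + [math.sin(2 * math.pi * 440 * t / 44100) for t in range(4096)]
onset_times = detect_onsets(signal)
```

A real game would refine this with spectral flux, tempo tracking, and note-placement heuristics, but the pipeline – frame the audio, measure change, emit timed events – is the same shape.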
All these experiences will range from somewhat passive to highly interactive – giving you agency, personalization, involvement, and unheard-of access to rare, expensive, and dangerous experiences only possible through an immersive medium like VR. It’s unlikely that VR versions will replace the traditional concert experience; they will just expand the audience and reach.
These richer media experiences will stick with us, providing spatial bookmarks in our mind. Done right, they will add value to a music world that is increasingly becoming a commodity. How do you add value to music? You create a more engaging experience.
The most crucial aspect will be getting the social component right. VR with friends is like VR squared – it completely changes the experience from isolated and self-reflective to shared and integrated. Demos like Oculus Toybox, where you play ping pong with another person in VR and toss objects around a shared virtual room, blew my mind. The new platform has to create a significantly better experience for all avenues of entertainment, but it typically needs to be built from the ground up for this new medium, not ported from an existing product. You can’t just add cut shots, zooms, and pans because they work in movies. VR is already compelling for great experiences – but what about creating music, and what are the limitations?
CREATING MUSIC IN VIRTUAL REALITY
In their current state, virtual reality headsets are fairly bulky and cumbersome, requiring several sensors for room-scale VR and an umbilical-like tether to your computer. Headsets can get greasy or steam up as your body heats from physical activity, and the main cable can quickly tangle around your body. With Samsung Gear VR, the battery runs out of juice fairly quickly, and the phone can overheat. With all that bulk, do you really want to spend 12-hour marathon studio sessions in VR? Not yet. This is a temporary state of affairs; five years from now there will be a much lighter solution, similar to the lightweight Magic Leap glasses. Cutting-edge VR also requires serious computing power, so you’ll need to invest in a new desktop machine or a significant upgrade to meet the recommended specs.
The most striking thing I’ve noticed is that after spending a few weeks in VR and returning to my studio, my DAW felt flat and uninspiring. Ableton’s spartan interface felt like reading the newspaper, not making music. But the most important aspect is the input device. The real game-changer with VR is the hand controllers. Instead of using key commands and mouse clicks, you use your hands: reach for a virtual backpack behind your shoulder, grab a sword, or combine your dominant and non-dominant hands for combination moves like firing a bow and arrow. This eliminates the need to remember button placements in games, and will likely remove the arcane key-command combos you have to memorize in modern DAWs for basic functions like consolidating a region or exporting files. It could open up universal body gestures that translate across DAWs, so you don’t need to remember the key commands unique to each one.
Novelty alone won’t be good enough. Creating music in VR will need significant and distinct advantages over conventional music making. This new world of creating music in VR will be more physical and feel more like play, which is great for your health (better blood flow) and for fostering “beginner’s mind” to get past creative roadblocks. You’ll be making music inside a sphere instead of on a flat, static horizon line. Experiences like Lyra and Polydome let you physically string together sparkling geometric notes into elaborate and unexpected arpeggios.
The Wave allows you to craft melodies with blocks, and dynamically perform music along with reactive visuals to a real audience in VR that dance and fist pump to their heart’s delight. Soundstage allows you to play virtual drums, trigger samplers, play a floating keyboard, and string together modules with virtual patch cables.
Elaborate surround panning will become much simpler with VR authoring tools, and it’s likely that even simple stereo mixes will be easier with virtual object panning. Muscle memory will provide an advantage, as you can remember where instruments and plugins are located spatially in your virtual studio, more intuitively organized than a file folder listed alphabetically.
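However the VR authoring layer looks, a stereo fold-down of an object’s position still reduces to a pan law. A minimal sketch of constant-power (-3 dB) panning – the mapping from position to angle is an illustrative assumption, not any particular tool’s implementation:

```python
import math

def constant_power_pan(position):
    """Map a pan position in [-1.0 (hard left), 1.0 (hard right)] to
    (left, right) gains whose powers always sum to 1, so an object
    keeps a constant perceived loudness as it sweeps across the field."""
    angle = (position + 1.0) * math.pi / 4.0  # maps [-1, 1] to [0, pi/2]
    return math.cos(angle), math.sin(angle)

# A centered object feeds both channels at ~0.707 (-3 dB per side).
left, right = constant_power_pan(0.0)
```

A VR panner would extend the same idea to azimuth and elevation around the listener, but the invariant – total power stays constant as the object moves – carries over.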
None of these VR tools will matter beyond basic entertainment if real-world import and export functions aren’t integrated. I want to be able to bring in existing projects and use VR for what it does best: providing visual emphasis and dimension, allowing physical interaction and intuitive hand gestures, and embracing non-linear thinking. We need support for real formats beyond the basic WAV/AIFF and MIDI standards – integration of Native Instruments STEMS, Pioneer rekordbox libraries, Ambisonics, and even Max for Live devices.
At its best, VR will allow us to regain the physical nature of music that’s been lost in a flat world of keyboards and mice. As Brian Eno said in his Wired interview with Kevin Kelly: “The problem with computers is that there is not enough Africa in them… it uses so little of my body… You’ve got this stupid little mouse that requires one hand, and your eyes. That’s it. What about the rest of you?”
PERFORMING MUSIC IN VIRTUAL REALITY
The physical nature of VR makes everything a performance. On the video side, captured live footage hasn’t been nearly as compelling as rendered content, so performance in VR will likely rely on a hybrid of the two. The most compelling case so far is The Wave, which lets you DJ in a virtual booth, teleport back and forth between booth and audience, trigger visuals and FX, and watch virtual crowd members move and react to your music.
It’ll be a great way to expand your audience, test out new material for special fans, and try out new visuals. The social aspect will be huge, as experiencing VR with friends is even more fun. Eventually merch integration will happen with virtual concerts, and Bitcoin payment seems inevitable. Would you walk up to a virtual merch booth to buy a shirt you’d wear in real life, or just buy a digital version for your avatar?
COLLABORATING AND SHARING IN VIRTUAL REALITY
Some of the biggest advantages of VR will revolve around collaboration. Rather than FaceTiming and trading Splice sessions with collaborators across the globe, you could stand next to them in the same virtual studio – hearing the same mixing-environment acoustics, with access to the same DAW. Session players could track their parts in real time. The main benefits would be less back-and-forth communication and better creative flow. Being in an actual room with real humans is always going to be better, but for remote situations this brings real advantages and better opportunities for learning.
The sharing aspect will take some time for broadband to catch up, but expect Snapchat VR and Facebook Live sessions, Twitch broadcasts, stem exports, open source Splice sessions, and all sorts of streaming integration with Spotify, Soundcloud, and Apple Music that will create exposure and open up revenue streams for musicians.
LEARNING AND TEACHING IN VIRTUAL REALITY
Thanks to endless YouTube tutorials, it’s become much easier to learn audio production, but imagine being mentored by someone in a virtual session as they show proper chord voicings for a guitar riff – lessons that benefit from physical demonstrations and real time learning.
Early studies are showing that learning and retention improve dramatically in a virtual space, because it combines the senses and provides a much more immersive visual and physical classroom. VR will enable students in remote locations to connect to opportunities around the world, as long as they have a decent internet connection and the proper hardware.
In its current state, VR presents a ton of exciting possibilities for music, along with a handful of challenges. It’s expensive to develop content and expensive to buy the setup (an average rig runs about $2,200 with headset, controllers, and computer), so reach is limited – but the technology is already here, and it’s ready for you. You really have to experience it for yourself; video clips don’t do it justice. How will VR integrate into your music-making process? Comment below and let me know!
- Morgan Page