The History of Virtual Reality
Let’s take a look back at the history of virtual reality, and how this technology – which could
dramatically affect music-makers – has advanced over the last few decades.
The father of VR
The know-how to create stereoscopic – ie, three-dimensional – imagery has been around since the advent of photography. Even then, it had long been understood that our visual perception of depth and distance derives from the differences between what each eye sees as a result of their separation – an effect known as parallax. Early photography pioneers recognised that a double-lens camera could capture a pair of images which, viewed so that each eye saw only its corresponding half, would recreate the parallax effect and allow an observer to perceive distance and depth within a flat, two-dimensional image.
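For the technically curious, here’s a rough sketch of the geometry involved – a minimal illustration assuming a simple pinhole-camera model, with a made-up focal length and an eye separation of roughly 65mm – showing how the horizontal offset (the ‘disparity’) between the two eyes’ views of an object shrinks as that object moves further away:

# Illustrative sketch of the parallax principle behind stereoscopy:
# under a simple pinhole model, a point's disparity between the left
# and right views is focal_length * eye_separation / depth, so nearby
# objects shift far more between the two views than distant ones.

def disparity_px(depth_m, baseline_m=0.065, focal_length_px=1000.0):
    """Horizontal offset (pixels) between the two views of a point
    depth_m metres away, for viewpoints separated by baseline_m."""
    return focal_length_px * baseline_m / depth_m

for depth in (0.5, 1.0, 5.0, 50.0):
    print(f"{depth:5.1f} m away -> {disparity_px(depth):6.1f} px of disparity")

Stereoscopic photographs, 3D films and VR headsets all exploit the same relationship in reverse: present each eye with a suitably offset view and the brain reconstructs the depth.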
For much of the past 50 years, VR remained the preserve of research labs and the military. That nearly changed in the 90s and early 2000s, as comparatively powerful computers found their way into more and more homes, and films such as 1992’s The Lawnmower Man and 1999’s The Matrix brought the idea of a virtual-reality experience to the attention of a much wider audience. But it wasn’t to be: the technology of the day was still primitive, and consumer-grade processor and screen technologies lacked the necessary power and miniaturisation. Public attention moved on, and mainstream consumer VR was dead… or so it seemed.
“What is ‘real’?” asks Morpheus in the film The Matrix, shortly before revealing that humans are
little more than batteries powering an artificially intelligent robotic master species. “How do you
define ‘real’?” he continues. “If you’re talking about what you can feel, what you can smell, what you
can taste and see, then ‘real’ is simply electrical signals interpreted by your brain.” If you think this is
just an example of top-notch Hollywood scriptwriting then you’re in for a shock. It turns out that
Morpheus is dead right, and this realisation has startling implications for our assumptions about the
nature of reality itself.
A lot of research into how humans perceive reality has focused on optical and other sensory illusions because, by studying how such illusions confuse our perception, researchers have been able to explore the limits of our internal model of reality and discover what happens when those limits are stretched to breaking point.
This work has shown just how far removed from reality that internal model can be: the mind makes many assumptions and draws many inferences from the input it receives, but doesn’t always come up with the right answers.
What you see, for example, can influence the sounds you perceive yourself to be hearing – something known as the McGurk effect. What all of these illusions demonstrate, as Michael Abrash, chief scientist at Oculus, explained at the Facebook Developer Conference, is that “given a particular set of inputs, we have very little conscious choice about the reality we’ll experience”.
In March 2016, the Oculus Rift hit the shelves. This had started life in 2012 as a Kickstarter project
whose success caught many by surprise, raising $2.5 million – Oculus’ Michael Abrash explains: “[At
that time] VR had been dead and buried for more than a decade, and seemed no closer to broad
success in 2012 than it had 25 years earlier.” Despite this, by 2015, Oculus and its technology had
been acquired by Facebook for a whopping $2 billion, and Abrash found himself giving a keynote
speech at the company’s annual Developer Conference, an idea that, to him in 2012, “wouldn’t just
have been science fiction, it would have been pure fantasy”.
Hot on Oculus’ heels came the VIVE, launched in April of the same year: the result of a collaboration between phone and display manufacturer HTC and video-game developer Valve, which also owns and operates the popular Steam gaming service. Then, a few months later, in October, Sony joined the party with the PSVR for PlayStation 4 consoles.
Moving forward to today, then, 2019 promises to be a big year for VR. The second generation of devices from Oculus and HTC/Valve – the Rift S and VIVE Pro respectively – is hitting the shelves, as are new systems from Microsoft and others. And, alongside HMDs, work is accelerating on all sorts of other hardware aimed at making virtual worlds ever more convincing: gloves, gauntlets and suits containing motors and actuators that can create sensations of touch, weight and inertia; omnidirectional treadmill controllers that let you walk around a VR world while staying in the same spot in the real one; and eye-tracking systems that know where you are looking in the virtual world and dynamically modify the view to better match how our eyes actually see, increasing realism and reducing the risk of VR-induced motion sickness. If you can imagine it, the chances are somebody’s already developing it.
It won’t be long, then, before we can routinely ride on board Lewis Hamilton’s F1 car or stand next to Dave Grohl while he performs to an arena of fans… and, indeed, if you are Dave Grohl, it won’t be long before you can perform to a hundred arenas full of fans while hanging out in your studio.
Loki Davison of Chroma Coda, developers of the immersive VR application The Music Room, also tells
us of another practical benefit that you might not have considered: “For me, VR is very useful for
prototyping the physical stuff I build. It’s great to model a sheet-metal part, put on the headset and
hold it in my hand and get an idea if it’s the right size and shape for the interaction you want.”
Room-scale VR
The Oculus Rift and HTC VIVE both support what’s known as ‘room scale’ VR. This allows the user to
move around within a predefined space, and have that movement mapped accurately into the
virtual world by sensors located in the room. This leads to more immersive experiences, and can go a
long way to alleviating VR-induced motion sickness because visual and balance senses aren’t
receiving mismatched inputs.
It can also lead to comedy moments as you trip over the cat meandering past or walk into a wall, although the borders of the ‘play area’ are mapped into the virtual world, so a grid or similar barrier can be displayed when you get too close to them. Having cables trailing behind you can be a nuisance, too, but wireless HMDs and wireless adapters for wired HMDs are now available.
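As a rough illustration of the idea – a minimal sketch only, not how Oculus’ Guardian or HTC’s Chaperone systems are actually implemented, and with made-up room dimensions and warning distance – the principle is simply to measure how far the tracked headset is from the edge of the calibrated play area and fade in the warning grid as that distance shrinks:

# Minimal sketch of a play-area boundary warning: fade in a grid as the
# tracked headset position approaches the edge of a rectangular space.
# (Illustrative only; values and behaviour are assumptions.)

PLAY_AREA_W = 3.0    # metres, hypothetical room width
PLAY_AREA_D = 2.5    # metres, hypothetical room depth
WARN_DISTANCE = 0.4  # start showing the grid this far from the border

def distance_to_border(x, z):
    """Distance (m) from headset position (x, z) to the nearest edge of
    a play area centred on the origin."""
    return min(PLAY_AREA_W / 2 - abs(x), PLAY_AREA_D / 2 - abs(z))

def grid_opacity(x, z):
    """0.0 = grid hidden; 1.0 = fully visible (user at or over the border)."""
    d = distance_to_border(x, z)
    return 0.0 if d >= WARN_DISTANCE else min(1.0, 1.0 - d / WARN_DISTANCE)

print(grid_opacity(0.0, 0.0))  # centre of the room: 0.0, grid hidden
print(grid_opacity(1.3, 0.0))  # 0.2m from a wall: 0.5, grid fading in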
Augmented reality
Augmented reality, or AR – sometimes referred to as ‘mixed reality’ (MR), and grouped with VR under the umbrella term ‘XR’ – mixes imagery from the real world with computer-generated graphics, although it is not necessarily stereoscopic in nature. Many mobile phone and Nintendo 3DS games use the technology to overlay game graphics on the image being picked up by the device’s camera; Pokémon GO is a popular example of this sort of game.
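At its simplest, the compositing step is just alpha-blending rendered graphics over the live camera frame. The toy sketch below (with small NumPy arrays standing in for the camera image and the rendered overlay) shows that blend; a real AR system would also have to track the camera’s position and orientation so the overlay stays anchored to the world, which is omitted here:

import numpy as np

# Toy illustration of camera-based AR compositing: blend an overlay with
# a per-pixel alpha (transparency) channel over the camera frame.

def composite(camera_frame, overlay_rgb, overlay_alpha):
    """camera_frame, overlay_rgb: HxWx3 arrays in [0, 1];
    overlay_alpha: HxW array in [0, 1]."""
    a = overlay_alpha[..., None]                 # broadcast alpha over RGB
    return overlay_rgb * a + camera_frame * (1.0 - a)

# Dummy 4x4 'camera frame' plus a half-transparent red square in one corner.
frame = np.zeros((4, 4, 3))
overlay = np.zeros((4, 4, 3)); overlay[:2, :2, 0] = 1.0
alpha = np.zeros((4, 4)); alpha[:2, :2] = 0.5
print(composite(frame, overlay, alpha)[0, 0])    # -> [0.5 0.  0. ]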
However, AR has many serious applications, too, for example allowing surgeons to see a patient’s
blood vessels and organs before lifting the scalpel, or helping search-and-rescue teams map infra-red
and other data sources onto their view of a search or disaster zone. Many see AR as having even
more potential than VR, and as a studio tool it offers loads of possibilities: imagine having virtual
synth and effect control panels hanging in virtual space around your real-world DAW and
instruments!