In 1965, Ray Dolby founded the audio technology company that would bear his name. Over five decades later, while audio remains its primary domain, Dolby has branched out into video – at the cinema with projection systems – and even virtual reality. But it’s doing more.
Past the doors of its San Francisco headquarters – where Gadgets 360 recently spent a day learning about how Dolby worked with Netflix to bring high-dynamic range (HDR) imagery to shows such as Marvel’s Iron Fist – the company is conducting advanced research on how audio and video affect the human body. In short, Dolby is studying how we react to what we watch.
Inside a darkened, sound-proofed living room – which Dolby has dubbed the ‘Biophysical Lab’ – the company’s team of scientists uses a variety of biosensors: a 64-channel EEG cap to monitor brain activity, a pulse oximeter for heart rate, a wrist sensor that measures galvanic skin response (a proxy for sweat gland and nervous system activity), and a thermal imaging camera, all to track responses to emotional stimuli.
The day we visited, the stimulus was a short clip from Guillermo del Toro’s Pacific Rim, and the person being monitored was a Dolby employee, who sat on a leather couch, surrounded by three displays and over a dozen speakers spread across the room. While the display in the middle screened the film, the one on the left showed a thermal image from the camera pointed at her, and the one on the right displayed real-time data coming from the sensors.
While this might seem far from Dolby’s mission to improve audio technology, the company’s chief scientist, Poppy Crum, told us that the late founder always spoke about how integrating “human perception and the human experience into the technologies we know” was at the core of Dolby from the start.
She noted that the data gathered has informed Dolby’s decisions in very different ways. For example, Crum and her team have been working on novel ways of monitoring signals to capture human responses from a thermal camera. That’s because with thermal, “you can capture without having to wire up someone, which may feel like you’re very removed from a normal, immersive setting”.
And while EEG is great for certain things, “it might not be the most predictive, most robust, [and the] most consistent,” Crum explained. “Sometimes, simpler responses tell us a lot more about what is happening to the user, and their behaviour,” she added.
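To make that concrete, here is a rough idea of what contactless thermal monitoring can look like in code – a minimal sketch in Python, where the region coordinates, frame sizes, and data are all hypothetical rather than Dolby’s actual method:

```python
# A minimal, hypothetical sketch of contactless thermal monitoring:
# track the mean temperature over a fixed face region, frame by frame.
# Nothing here is Dolby's code; it just illustrates the idea.
import numpy as np

def cheek_temperature_trace(frames, roi=(slice(60, 80), slice(40, 70))):
    """Mean temperature inside a region of interest, one value per frame.

    frames: iterable of 2-D arrays of per-pixel temperatures (deg C)
    roi:    (row_slice, col_slice) over, say, the viewer's cheek
    """
    return np.array([frame[roi].mean() for frame in frames])

# Synthetic 30-second clip at 10 fps: 300 frames of 120x160 pixels.
rng = np.random.default_rng(0)
frames = 34.0 + 0.1 * rng.standard_normal((300, 120, 160))
frames[150:, 60:80, 40:70] += 0.4  # simulated flush halfway through

trace = cheek_temperature_trace(frames)
print(f"temperature rise: {trace[150:].mean() - trace[:150].mean():.2f} deg C")
```

No electrodes, no wires – which is exactly the appeal Crum describes.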
The hope is that answers to the questions they’re asking can inform content creation in the future. Crum gave the example of the spike in skin response just before a penalty kick in a football match; it doesn’t even have to be a team you support, she added – it’s simply human anticipation of the event. And when there’s an explosion in a film, the team has noticed a corresponding response on the viewer’s cheeks.
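Spotting that penalty-kick spike is, at its core, a peak-detection problem. Here is a minimal sketch, again with entirely hypothetical data, sampling rate, and thresholds – a generic approach, not Dolby’s pipeline:

```python
# Minimal sketch: flagging anticipatory spikes in a galvanic skin
# response (GSR) trace. The signal, sampling rate, and thresholds are
# hypothetical -- this is a generic approach, not Dolby's pipeline.
import numpy as np
from scipy.ndimage import uniform_filter1d
from scipy.signal import find_peaks

fs = 32  # assumed sampling rate in Hz
t = np.arange(0, 60, 1 / fs)

# Hypothetical recording: slow drift, sensor noise, one arousal burst.
rng = np.random.default_rng(0)
gsr = 2.0 + 0.01 * t + 0.05 * rng.standard_normal(t.size)
gsr[t > 40] += 0.6 * np.exp(-(t[t > 40] - 40))  # spike just before 'the kick'

# Subtract a 5-second moving average so the fast spike stands out.
baseline = uniform_filter1d(gsr, size=fs * 5, mode="nearest")
phasic = gsr - baseline

# Keep peaks well above the noise floor, at most one event per second.
noise = np.std(phasic[: fs * 10])
peaks, _ = find_peaks(phasic, height=3 * noise, distance=fs)
print("anticipatory events at t =", np.round(t[peaks], 1), "seconds")
```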
The tests inside the Biophysical Lab aren’t conducted in isolation; they link into work from other facilities, such as Dolby’s ‘Sensory Immersion Lab’. There, the company can capture an audio environment on location – be it a cave, a concert hall, or even the Dolby Theatre in Hollywood (where the Oscars are hosted) – reproduce it fully back in its office, and then measure the effect it has.
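The article doesn’t spell out how that capture-and-replay works, but the standard technique behind this sort of simulation is convolving a ‘dry’ recording with an impulse response measured in the target space. A minimal sketch, with hypothetical file names (and no claim that this is Dolby’s exact method):

```python
# Minimal sketch: make a dry recording sound as if it were played in a
# captured space by convolving it with that room's impulse response.
# File names are hypothetical; Dolby's actual method isn't public here.
import numpy as np
from scipy.io import wavfile
from scipy.signal import fftconvolve

rate, dry = wavfile.read("dry_voice.wav")               # hypothetical source
rate_ir, impulse = wavfile.read("concert_hall_ir.wav")  # hypothetical IR
assert rate == rate_ir, "resample first so both share one sample rate"

# Work in float mono for simplicity.
dry = dry.astype(np.float64)
impulse = impulse.astype(np.float64)
if dry.ndim > 1:
    dry = dry.mean(axis=1)
if impulse.ndim > 1:
    impulse = impulse.mean(axis=1)

# Convolution stamps the room's echoes and decay onto every sample.
wet = fftconvolve(dry, impulse)
wet /= np.max(np.abs(wet))  # normalise to avoid clipping

wavfile.write("voice_in_hall.wav", rate, (wet * 32767).astype(np.int16))
```

Swap in an impulse response captured in a cave – or the Dolby Theatre – and the same clip takes on that space’s acoustics; a lab-grade version of that is what makes the effect measurable.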
“That lab can talk to us up here, so we can do the behavioural studies, paired with the physiological studies,” Crum said. Dolby not only has a top team of neuroscientists, she added, but everyone in her team has a coding background. And for them, it’s about turning “biology into actionable algorithms”.
Right now, the data Dolby collects isn’t being used to dictate how movies or TV shows are made. But studies have shown that EEG data is more reliable than participants’ own statements when it comes to predicting the success of films. Naturally then, it’s only a matter of time before an algorithm decides how sudden and bright an explosion should be, to elicit the desired response.