
Adobe Researchers have been honored with a Technology and Engineering Emmy for their groundbreaking work on virtual reality (VR) editing tools. Awarded by the National Academy of Television Arts & Sciences, the Emmy celebrates Adobe’s “significant contributions to the development of 360-degree consumer video capture, editing, and presentation technologies.”
Adobe Research’s Principal Scientist Stephen DiVerdi, Senior Research Scientist Cuong Nguyen, and Senior Experience Designer Seth Walker joined several Adobe colleagues to accept the award. Principal Scientist Aaron Hertzmann also contributed to the work.
Adobe’s breakthrough tools for VR editing
Beginning a decade ago, Adobe Researchers pioneered tools that help VR creators manage the unique challenges of editing a spherical VR video on a flat screen.
Inside Adobe Premiere, Adobe Researchers helped create Adobe Immersive Environment, a feature that let users view their video, along with the Premiere timeline, inside a VR headset while using the desktop app. The feature was also added to After Effects with Theater Mode, allowing users to see their content in VR as if it were being projected onto a theater screen.
Building new tools in the early days of VR
Back in 2015, VR was just becoming commercially available and VR filmmakers were stepping in with new ideas. And many of them were editing their work in Adobe Premiere—with tools that weren’t yet tailored to a 360-degree world.
“When you look at VR on a flat screen, it’s just a large rectangular video that’s heavily distorted because it’s a spherical projection. You can put that in Premiere and trim it. You can even put some text in the center and it’ll look mostly right as long as it’s not too big,” explains DiVerdi. “But as soon as you want to do anything, like rotate the world a little bit or align the camera or put the text somewhere other than the center, none of the tools work anymore. So it was clear, as we talked with VR filmmakers, that they had a lot of editing needs that were not being met.”
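To make the distortion DiVerdi describes concrete, here is a minimal sketch (not Adobe's implementation) of why "rotating the world a little bit" in a 360-degree video can't be done with ordinary flat-image tools: every pixel of the equirectangular frame has to be remapped through a 3D rotation on the sphere. The function name, the yaw/pitch parameters, and the NumPy `frame` array are illustrative assumptions.

```python
import numpy as np

def rotate_equirect(frame, yaw=0.0, pitch=0.0):
    """Re-render an equirectangular frame (H x W x 3) as if the camera were rotated."""
    H, W = frame.shape[:2]

    # Longitude/latitude for every output pixel of the flat 2:1 projection.
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    lon = (u / W) * 2 * np.pi - np.pi          # -pi .. pi across the width
    lat = np.pi / 2 - (v / H) * np.pi          # pi/2 .. -pi/2 down the height

    # Unit direction on the viewing sphere for each output pixel.
    dirs = np.stack([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)], axis=-1)

    # Inverse rotation: where on the original sphere each output direction came from.
    cy, sy = np.cos(-yaw), np.sin(-yaw)
    cp, sp = np.cos(-pitch), np.sin(-pitch)
    R_yaw = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    R_pitch = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    src = dirs @ (R_pitch @ R_yaw).T

    # Back to source pixel coordinates and sample (nearest neighbor for brevity).
    src_lon = np.arctan2(src[..., 1], src[..., 0])
    src_lat = np.arcsin(np.clip(src[..., 2], -1, 1))
    su = ((src_lon + np.pi) / (2 * np.pi) * W).astype(int) % W
    sv = ((np.pi / 2 - src_lat) / np.pi * H).astype(int).clip(0, H - 1)
    return frame[sv, su]
```

A plain 2D shift or rotation of the flat frame would break the projection; only a per-pixel remap like this keeps the sphere intact, which is why text or graphics placed anywhere but the undistorted center of the frame needed new tooling.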
The team’s first move was to simplify the process by allowing editors to work directly in the VR headset. Nguyen, who had just joined Adobe Research as an intern working with DiVerdi and Hertzmann, jumped right in. By the end of his internship, he had created a sophisticated proof-of-concept app that ran in a headset, providing a timeline where users could trim clips, add music, and do other VR-specific editing tasks.
Based on that early work, DiVerdi presented Project Clover, a collection of dedicated tools for viewing and editing directly in VR, as an experimental Sneak at Adobe MAX in 2016. “The VR video users loved it because the pain points around moving in and out of headsets while trying to edit on a flat screen were really significant for them,” DiVerdi remembers.
Adobe Researchers presented their work in a paper at the 2017 ACM CHI Conference (authors: Nguyen, DiVerdi, Hertzmann, and Feng Liu). In the same year, Adobe acquired Mettle, a company that built VR-specific editing tools for Premiere and After Effects, adding even more VR power directly into Adobe products.
Then, later in 2017, the team’s work helped power the Adobe MAX Sneak Project SonicScape, a tool designed to help video editors see, and edit, where a sound is located in a VR video.
Looking ahead for VR
While VR continues to be a relatively niche space, DiVerdi and the team are excited about what may be ahead. “Adobe’s VR tools are used to do things that, I think, are just frankly amazing,” says DiVerdi. “I would really like it if my entire computing environment could eventually be in VR. That was one of my first projects in grad school—to put on a headset and have all your apps and everything laid out so you could interact with them. I want to see the technology continue to develop because I think there are so many awesome things we can do with it.”
Wondering what else is happening inside Adobe Research? Check out our latest news here.