Four Adobe researchers gathered recently to dig into the latest developments in digital humans, the data-driven models that allow researchers to translate things humans do in the real world, such as body movements and facial expressions, into the digital world.
Kim Pimmel, a lead research designer who focuses on new technologies that empower creative people, led the roundtable of experts. He was joined by Research Manager Duygu Ceylan, Senior Research Scientist Jimei Yang, and Senior Research Engineer Jun Saito.
The team shared their insights on some of the biggest research questions behind digital humans, how the technology powers new human-computer interactions, and the future of creative storytelling tools.
A closer look at digital humans
Humans are central to so many of the stories people want to tell, which is why researchers are so interested in digital humans. Digital humans are models that understand what people do, and they power the latest tools for designing and editing images, videos, and animations of humans.
“Being able to understand humans is the key to giving people tools to create their stories as effectively as possible,” explains Ceylan.
One use of digital humans is to enable easier, more intuitive editing of existing images and videos. As Yang explains, “When our users capture humans and shoot photos of humans, the conditions or environment may not be perfect, so there’s a strong desire to edit those images — for example, to change the hairstyle, change the color of the photo… change the lighting conditions, change the expression of the face… so the user can create a better story.”
But digital humans aren’t just useful for realistic depictions — they’re also behind some of the most exciting new ways to create and animate fictional characters. And this raises fascinating technical and philosophical questions.
“It’s not necessarily about realistic representation,” says Saito. “It’s more about…what we perceive as human elements. The humans we see in cartoons are not necessarily realistic, but we recognize them as humans for some reason. The study of this is really about understanding what we perceive as human and human-related attributes.”
What can digital humans help us do?
Digital humans are built into some of the latest tools for editing and animating humans and other characters in Adobe Photoshop, Lightroom, Aero, and Character Animator.
For example, in Character Animator, researchers used digital human models to build body and face tracking (where characters mimic the movements of the humans who create them) and audio-driven animation. “This is a new medium for driving animation instead of using a mouse and keyboard for keyframing,” explains Saito. It removes some of the time-consuming manual work, and it could also let non-technical people express their ideas in new ways.
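To make the tracking idea concrete, here is a minimal Python sketch of the general pattern, not Character Animator’s actual implementation: tracked face landmarks are converted into a rig parameter on every frame, so the performance itself takes the place of hand-placed keyframes. The Landmark type, the mouth_open_amount helper, the 0.15 normalization factor, and the sample landmark values are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Landmark:
    """A single tracked face point in normalized image coordinates."""
    x: float
    y: float

def mouth_open_amount(upper_lip: Landmark, lower_lip: Landmark,
                      face_height: float) -> float:
    """Map the lip gap to a 0..1 rig parameter, normalized by face size."""
    gap = abs(lower_lip.y - upper_lip.y)
    return max(0.0, min(1.0, gap / (0.15 * face_height)))

# Simulated tracker output for three frames: closed, half open, wide open.
frames = [
    (Landmark(0.5, 0.60), Landmark(0.5, 0.61)),
    (Landmark(0.5, 0.59), Landmark(0.5, 0.64)),
    (Landmark(0.5, 0.57), Landmark(0.5, 0.68)),
]

for upper, lower in frames:
    # In a real pipeline this value would drive the puppet's mouth rig
    # each frame, instead of an animator setting a keyframe by hand.
    print(f"mouth_open = {mouth_open_amount(upper, lower, 0.5):.2f}")
```

The same pattern generalizes: any tracked signal (body pose, audio amplitude) can be remapped to any rig parameter, which is what makes performance a viable input medium alongside the mouse and keyboard.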
As researchers develop a wider range of human-computer interfaces for creating, editing, and animating characters, Ceylan is excited to see what people will be able to create.
“We see more and more cases where we are able to control models with multimodal input. It can be text, it can be audio, it can be images, sketches, or other 3D cues,” she says. “I believe we will see examples of using all these multimodal inputs to create digital humans, not only to control how they look, but potentially control how they talk, how they move, how they reach physical constraints. So I think it’s very exciting to bring that capability to creative people.”
Check out the Adobe Research Roundtable
Want to know more about the research behind digital humans? Listen in as Pimmel, Ceylan, Yang, and Saito discuss what it’s like to collaborate with the creative community to build new tools, how they approach ethical questions around assembling datasets and creating AI and generative models, and the future of creative tools built on digital humans.