The Adobe Research team helped reveal the future of artificial intelligence and generative AI for the marketing world at this year’s Adobe Summit. The event — which returned to Las Vegas live this year — gave the global marketing community a preview of cutting-edge new tools that offer deeper insights into what customers want and need, make it easier to create marketing content, and enable personalization at scale.
The Adobe Research team helped introduce exciting new technologies and ideas that were featured in a keynote, two Sneaks, and several Summit sessions. Here’s a look at what they shared.
Summit Keynote: Generative AI finds objects from images — and then sends them on a virtual photoshoot
For this year’s Content Supply Chain Keynote Demo, Adobe Senior Technical Evangelist Ben Tepfer shared how Custom Diffusion for Enterprise incorporates Adobe’s new generative AI tool, Adobe Firefly, to generate images of a specific product from just a few photos. Adobe Research is behind the work on Custom Diffusion.
In his demo, Tepfer used photos of a tent to teach Firefly how to generate a particular product, and then created a virtual photoshoot of the tent next to a remote mountain lake — all without ever leaving the keynote stage.
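For readers curious about the mechanics, the published Custom Diffusion technique personalizes a text-to-image model by updating only the cross-attention key and value weights, which is why a handful of photos is enough. Below is a minimal, illustrative sketch of that idea using the open-source Hugging Face diffusers library; Firefly’s internals are not public, so the checkpoint name and training setup here are assumptions.

```python
# Minimal sketch of the Custom Diffusion idea: fine-tune only the cross-attention
# key/value projections of a text-to-image diffusion model on a few product photos.
# Illustrative only -- Firefly's implementation is not public; the checkpoint name
# below is an assumption.
import torch
from diffusers import UNet2DConditionModel

unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet"
)

# Freeze everything, then unfreeze only the cross-attention K/V weights ("attn2").
trainable = []
for name, param in unet.named_parameters():
    if "attn2.to_k" in name or "attn2.to_v" in name:
        param.requires_grad = True
        trainable.append(param)
    else:
        param.requires_grad = False

optimizer = torch.optim.AdamW(trainable, lr=1e-5)
# A standard denoising loss over the handful of product photos would then update
# just these weights, leaving the rest of the model intact.
```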
Presenter: Ben Tepfer
Adobe Research collaborators: Nick Kolkin, Richard Zhang, Wei-An Lin, Cynthia Lu, Eli Shechtman, and Jun-Yan Zhu
Additional collaborators: Nupur Kumari, Tomasz Opasinski, Brooke Hopper, Midhun Harikumar, Ajinkya Kale, Rachel Einstein-Sim, and Baldo Faieta
Adobe Research Summit Sneaks: Inclusive, personalized digital shopping and easier marketing videos
Each year, Adobe Summit features Sneaks — previews of brand new, in-development technology. This year, Adobe Research was behind two of the key Sneaks: a new app for personalized, inclusive shopping and a better tool for tailoring marketing videos to every audience.
Custom Clips
Custom Clips reduces the time and complexity of video editing so marketers can quickly create marketing videos for each of their important audiences and channels. Custom Clips starts by using Adobe Sensei AI to analyze past videos’ performance data. Then, it edits new videos automatically, taking the video’s audience and destination into account. For example, Custom Clips can turn a 60-second commercial into 15-second connected TV spots with content tailored to highly loyal customers, and then repeat the process to create a different set of videos for new customers.
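The editing intelligence is the in-development part of Custom Clips, but the basic mechanics of cutting one 60-second master into 15-second variants can be illustrated with standard tooling. The sketch below uses ffmpeg via Python; the file names, audiences, and cut points are hypothetical placeholders, not how the Sneak itself works.

```python
# Illustrative only: cut a 60-second master video into 15-second variants,
# one per audience. Custom Clips chooses segments with Adobe Sensei AI; the
# start times below are hypothetical placeholders.
import subprocess

MASTER = "commercial_60s.mp4"      # hypothetical input file
CUTS = {                           # hypothetical audience -> segment start (seconds)
    "loyal_customers": 5,
    "new_customers": 30,
}

for audience, start in CUTS.items():
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-ss", str(start),     # seek to the chosen segment
            "-t", "15",            # keep 15 seconds
            "-i", MASTER,
            "-c", "copy",          # trim without re-encoding
            f"spot_15s_{audience}.mp4",
        ],
        check=True,
    )
```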
Presenter: Stefano Petrangeli
Adobe Research collaborators: Haoliang Wang, Somdeb Sarkhel, Saayan Mitra, and Vishy Swaminathan
True Colors
True Colors is an AI-powered tool designed to boost inclusivity in digital shopping — and personalize the shopping experience for every shopper — by helping people find clothes and color palettes that are just right for them. First, True Colors uses Adobe Sensei AI to analyze a shopper’s photo for color undertones, facial features, and more, so it can match shoppers with complementary colors. Then, True Colors filters website inventories and shows shoppers the products available in their best colors. This makes it quicker and easier for each person to discover apparel that complements their unique look.
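The model behind True Colors isn’t public, but the underlying color-theory step (finding a complementary hue on the opposite side of the color wheel) can be shown with a toy example. The sketch below averages a photo’s pixels as a crude stand-in for undertone analysis; it is not the Sensei-powered approach used in the Sneak, and the file name is hypothetical.

```python
# Toy illustration (not the True Colors model): estimate an image's dominant
# hue and suggest its complement by rotating 180 degrees around the color wheel.
import colorsys
from PIL import Image

def complementary_color(photo_path: str) -> tuple:
    img = Image.open(photo_path).convert("RGB").resize((64, 64))
    pixels = list(img.getdata())
    # Average the pixels as a crude stand-in for undertone analysis.
    r, g, b = [sum(channel) / len(pixels) / 255.0 for channel in zip(*pixels)]
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    # The complementary hue sits directly opposite on the color wheel.
    comp = colorsys.hsv_to_rgb((h + 0.5) % 1.0, s, v)
    return tuple(round(c * 255) for c in comp)

print(complementary_color("shopper_photo.jpg"))  # hypothetical file name
```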
Presenter: Dej Mejia
Adobe Research collaborator: Jose Echevarria
Additional collaborators: Rob Burke, Cole Connelly, Jacob Hanson-Regalado, Michelle Lee, Ronald Oribio, Michele Saad, and Andrew Thomson
Summit Sessions from Adobe Research: Tools for better experimentation, smarter captions for charts and data, and more
At Adobe Summit Sessions, attendees boost their skills, discover powerful new features in Adobe tools, and catch up on the latest trends. Here are the Summit Sessions where Adobe Research’s work was spotlighted.
Experimentation Hub and Experimentation in Adobe Journey Optimizer
Attendees learned about Experimentation Hub, a tool that allows users to take advantage of the latest advancements in experimentation inside Adobe Journey Optimizer. Experimentation Hub uses Anytime-Valid Confidence Sequences, a technology developed by Adobe Research, to power uncertainty estimation and stopping criteria for experiments. With this new capability, users can run their experiments as long as they need to and continuously monitor the results — all without compromising on statistical correctness.
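The session didn’t spell out the math, but the general idea behind anytime-valid confidence sequences fits in a few lines. The sketch below uses one classic construction, a Robbins-style normal-mixture boundary for sub-Gaussian data; it illustrates the statistical principle of peeking at results continuously without inflating the error rate, not the specific method shipped in Adobe Journey Optimizer.

```python
# Minimal sketch of an anytime-valid confidence sequence (normal-mixture /
# Robbins-style boundary for sub-Gaussian observations). Illustrative only;
# the exact construction used in Adobe Journey Optimizer is not described here.
import math
import random

def confidence_sequence(observations, sigma=1.0, alpha=0.05, rho=1.0):
    """Yield (running mean, half-width) after every observation; the intervals
    cover the true mean simultaneously at all sample sizes with prob >= 1 - alpha."""
    total, n = 0.0, 0
    for x in observations:
        total += x
        n += 1
        v = n * sigma**2  # accumulated variance ("intrinsic time")
        radius = math.sqrt((v + rho) * math.log((v + rho) / (rho * alpha**2))) / n
        yield total / n, radius

# Example: stop as soon as the interval excludes zero (a valid stopping rule).
data = (random.gauss(0.2, 1.0) for _ in range(100_000))
for t, (mean, half_width) in enumerate(confidence_sequence(data), start=1):
    if abs(mean) - half_width > 0:
        print(f"Stopped at n={t}: mean {mean:.3f} +/- {half_width:.3f}")
        break
```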
Presenter: Jen Lasser
Adobe Research collaborators: Ritwik Sinha, David Arbour, Raghav Addanki, and Vishy Swaminathan
Additional collaborators: Akash Maharaj, Simon Liu, Moumita Sinha, Manas Garg, Tao Wang, Justin Grover, and Rachel Hanessian
Intelligent Captions
In this session, attendees got a look at Intelligent Captions, a new AI-based technology that understands charts and data, and then generates insights about them in plain language. The technology is based on Adobe’s own distilled version of a large language model, which was created to improve the quality of AI-generated writing. The new model debuted on the Summit Main Stage during this year’s opening keynotes, and its technical details were presented and demoed during a Super Session.
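Adobe hasn’t published the details of its distilled model, but the general recipe behind model distillation, training a smaller student to match a larger teacher’s softened output distribution, can be sketched in a few lines. The loss below is the standard soft-target formulation; the temperature, shapes, and random logits are illustrative, not Adobe’s implementation.

```python
# Generic knowledge-distillation loss (soft targets from a teacher model), shown
# only to illustrate what "a distilled version of a large language model" means;
# it is not Adobe's implementation.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between temperature-softened teacher and student distributions."""
    t = temperature
    soft_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_soft_student = F.log_softmax(student_logits / t, dim=-1)
    # Scale by t**2 so gradients keep a comparable magnitude across temperatures.
    return F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * t**2

# Example with random logits over a toy vocabulary of 10 tokens.
student = torch.randn(4, 10)
teacher = torch.randn(4, 10)
print(distillation_loss(student, teacher))
```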
Presenters: Sanjay Vachani and Jennifer Werkmeister
Adobe Research collaborators: Eunyee Koh, Shunan Guo, Gromit Chan, Victor Bursztyn, and Shiv Saini
Additional collaborators: Prithvi Bhutani (PM), Wei Zhang, and Abhisek Trivedi
Data Distiller
Attendees learned how to use new tools that help separate the signal from the noise in business data, giving marketers quicker, better insights for customer engagement, segmentation, and analysis. The session focused on getting more value from Adobe Real-Time Customer Data Platform datasets, including powerful data transformations that feed Adobe Experience Platform applications such as Customer Journey Analytics and Adobe Journey Optimizer, as well as non-Adobe destinations.
Presenter: Annamalai Annamalai (PM)
Adobe Research collaborators: Eunyee Koh, Shunan Guo, Gromit Chan, and Shiv Saini
Additional collaborators: Saurabh Mahapatra (PM) and Vasanthi Holtcamp’s engineering team
Wondering what else happened at Adobe Summit this year? Get the details here.