The research behind Harmonize, a new generative compositing tool that blends objects into any background 

October 31, 2025

Tags: AI & Machine Learning, Computer Vision, Imaging & Video

Harmonize, a new AI-powered compositing tool inside Photoshop, is built on the work of Adobe Researchers in close collaboration with our product team partners. Harmonize automatically blends an object into another image by removing the object’s original background and adjusting its color, lighting, and shadows to match the new scene. 

By automating some of the most complex and tedious steps of image compositing, Harmonize lets professionals move more quickly to refining their images with Photoshop’s other tools. For novices, it makes it possible to create high-quality composite images without mastering all of the technical steps. 

The world got its first glimpse of the technology behind Harmonize back in 2024, when Research Scientist Mengwei Ren presented it as the MAX Sneak Project Perfect Blend. At the time, Adobe Researchers, including Ren and Senior Research Scientist and Engineer He Zhang, had already been working on the technology for several years.

How Adobe Researchers created the technology behind Harmonize 

In 2021, Zhang and his team introduced a neural filter in Photoshop that could change the global color and saturation of a foreground image to match a new background. By 2023, they were building on this work, collecting datasets and training models for AI-powered relighting. That year, Ren joined the team as an intern, with a project designed to bring relighting into the larger context of harmonizing a composited portrait, something that had never been done before. 

One of the biggest challenges, Zhang remembers, was getting a high-quality dataset to train the model. “Relighting methods often require HDR (high dynamic range) panorama environment maps, but general users usually capture LDR (low dynamic range) images—so there’s just not a lot of HDR information. One of Mengwei’s big accomplishments was to experiment with using LDR background images, paired with a powerful diffusion model, to do portrait relighting. The paper we published at CVPR 2024 was, I believe, the first in the industry to explore this method.” 

After a successful research internship, Ren completed her PhD and returned to Adobe to dive deeper into the project. “First, we expanded the scope beyond portraits to full-body images and non-human subjects. Then we brought in visual effects including shadow and reflection generation,” explains Ren. “We wanted to build the first general purpose, unified model for all of the main functions of harmonization: foreground relighting, background visual effects generation, and boundary blending.” 

The research process involved countless rounds of iteration. For each one, the team gathered new images to train the model, reviewed the results, and pinpointed edge cases that didn’t quite work yet. For example, in early models, the reflections didn’t look realistic enough, so the team gathered more data with reflections to refine the model.  

Throughout the process, the team relied on their own observations and invited users into a private beta to give feedback. But evaluating a creative tool like Harmonize can be tricky. “Harmonization is subjective, which means that evaluating it is really about eyeballing the results. There aren’t any strict quantitative metrics for us to look at, so it always involves getting lots of opinions so we can constantly improve the model,” says Ren.

Going from a MAX Sneak to a Photoshop feature 

In 2024, the team’s work was selected for an Adobe MAX Sneak. The team adopted the working name Project Perfect Blend, built a demo that showcased what the technology could do, ran through the demo hundreds of times, and rehearsed for weeks to refine a storyline that was relatable for the MAX audience.

“It was so rewarding to present to people who use Adobe products every day—and so much different than the kinds of presentations researchers usually give to each other. The audience was excited, and we got such positive feedback. It helped us think even more carefully about user needs—and it made us want to deliver the technology as soon as possible,” says Ren.

After MAX, the Research team began collaborating with Adobe’s services team to build an API on top of the model. They also worked with the Photoshop team to tackle the technical hurdles of implementing Harmonize inside Photoshop. The teams met regularly to share progress and blockers, and researchers jumped in to debug as issues came up.  

Since the Harmonize beta launched, user responses have been positive, as has coverage in places like TechCrunch and CNET. Now the team is excited about what’s next, including future refinements that will give users more precise control over things like lighting and shadow intensity.

“Our goal is always to help users with their creative projects, and we’re excited to keep taking that further,” says Ren.  

Wondering what else is happening inside Adobe Research? Check out our latest news here.
