New Adobe Video Auto-Tagger Taps into Machine Learning

A new Adobe video auto-tagger—an effort across Adobe Research, Sensei & Search, Stock, and Livefyre—does the tagging work for you. The feature has now shipped in Livefyre and has garnered online media attention.

The Livefyre feature, known as Smart Tags for Video, is part of Adobe Experience Manager (AEM). It’s capable of automatically generating tags for the hundreds of thousands of user-generated video clips passing through the Livefyre system each month.

User-generated video is an enormous source of content on the web: Last year, YouTube users alone uploaded 300 hours of video per minute. Marketers are eager to tap into this vast pool and understand its contents better.

Adobe’s Smart Tags for Video produces two sets of tags per clip. One covers around 150,000 classes of objects, scenes, and attributes; the other identifies actions such as running, surfing, or eating. The machine learning network behind the tool was trained on images and videos from Adobe Stock.

To give this technology context, Santiago Pombo (Senior Product Manager, Livefyre) wrote a blog post about it on Medium.

Check out this VentureBeat story to learn more.

Contributors:

Bryan Russell, Josef Sivic, Hailin Jin, Joon-Young Lee, Ruppesh Nalwaya, Markus Woodson, Zhe Lin, Vishy Swaminathan, Saayan Mitra, Jerry Hall, Pramod Srinivasan, Samarth Gulati, Dibyajyoti Ghosh, Sameep Solanki, Yifu Wang, Tim Converse, Brett Butterfield, Baldo Faieta, Alex Filipkowski, Kate Sousa, Bill Marino, Ludovic Levesque, Santiago Pombo

Story by Meredith Alexander Kunz
