I am a researcher at the Imagination Lab, Adobe Research. My research interests include natural language processing (NLP), artificial intelligence, and machine learning.
In my previous research, I focused on language sequence modelling using Recurrent Neural Networks, incorporating hierarchical representations, gated attention, uncertainty propagation, stacked residual learning, and context-dependent and structure-dependent models to improve the precision, efficiency, and interpretability of current RNN architectures. At the moment, I am particularly interested in efficient and accurate models for text processing in low-to-medium resource scenarios, with applications to dialog systems and sequence modelling.
I am looking for interns to work on language generation, large-scale dynamic classification, and dialog control. Information about the Adobe internship program can be found here.