Jennifer Healey is currently looking into how people can best interact with new generative technologies for both language and images using reinforcement learning with human feedback (RLHF). She has a long history of studying how people interact with technology and envisioning the new experiences that these technologies enable. She holds BS, MS, and PhD degrees from MIT in electrical engineering and computer science. During her graduate studies at the Media Lab, she pioneered the field of “Affective Computing” with Rosalind Picard and developed the first wearable computer with physiological sensors and a video camera, which allowed the wearer to track their daily activities and record how they felt while doing them. She worked at both IBM Zurich and IBM TJ Watson on AI for smartphones with a multi-modal user interface that allowed the user to switch seamlessly between voice and visual input and output. She has been an Instructor in Translational Medicine at Harvard Medical School and Beth Israel Deaconess Medical Center, where she worked on new algorithms to predict cardiac health from mobile sensors. She continued working in Digital Health at both HP and Intel, where she helped develop the Shimmer sensing platform and the Intel Health Guide. Her research at Intel extended to sensing people in cars and cooperative autonomous driving (see her TED talk).