The idea that wearables could act as affective interfaces to help computers understand human emotion was formative in the early development of affective computing. Just as cameras and microphones provided signals that computer vision and natural language processing algorithms could interpret to help computers understand what people saw and heard, so it was hoped that wearables with physiological sensors could provide the signals affective computing algorithms needed to enable computers to understand what people felt. It was imagined that such algorithms could make wearables the next generation of truly "personal" computing, creating a holistic interface between human experience and the digital world.