Researchers in the University of Rochester’s Department of Computer Science and the Department of Psychiatry at the University of Rochester Medical Center (URMC) recently developed new technology that can gauge a user’s mood by observing them through a webcam and assessing the tone of their social media output. In a recent paper titled “Tackling Mental Health by Integrating Unobtrusive Multimodal Sensing,” the researchers described how a computer can use facial recognition and data mining to interpret and predict human emotions. UR Professor of Computer Science Jiebo Luo presented the paper last week at a national conference of the Association for the Advancement of Artificial Intelligence in Austin, Texas.
The methods used by Luo and his colleagues are on the cutting edge of research into human-computer interaction and artificial intelligence. Computer vision is a major research interest in human-computer interaction; devices as commonplace as digital cameras use facial recognition techniques to locate the faces in a picture in order to optimize the exposure.
Luo’s computer program locates the forehead of a user in order to track the rest of the face. Once this is established, a range of signals can be recorded, including pupil dilation, rate of blinking and facial expressions such as smiles or frowns. This data can be interpreted to accurately determine a user’s mood.
Beyond facial cues, the computer can use other physical data to assess mood. Luo explained how the program uses tiny color changes in the forehead to determine a user’s heart rate.
“We take a lot of measurements from the forehead and the cheek, and then we average that to get the heart rate,” Luo said of the technique. “In practice, we can get it to within plus or minus five counts.” This method for measuring heart rate via webcam was discovered by other researchers, but it required users to stay perfectly still. Luo and his colleagues extended the technique so that it can track users as they move.
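The idea behind this kind of webcam pulse detection can be illustrated with a short sketch. The paper's actual pipeline is not public here, so the following is only a minimal, hypothetical version of the general approach: average a color channel over a skin region frame by frame, then find the dominant frequency in the plausible heart-rate band. The synthetic signal and all parameter values are assumptions for the demo.

```python
import numpy as np

def estimate_heart_rate(green_means, fps, lo=0.75, hi=3.0):
    """Estimate heart rate in BPM from per-frame mean green-channel values
    of a forehead/cheek region. Detrend the trace, then pick the strongest
    frequency between lo and hi Hz (45-180 BPM)."""
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()          # remove the DC (skin tone) offset
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= lo) & (freqs <= hi)     # restrict to plausible pulse rates
    peak = freqs[band][np.argmax(power[band])]
    return peak * 60.0

# Synthetic demo: a 72 BPM pulse (1.2 Hz) buried in noise, sampled at 30 fps.
fps, seconds = 30, 20
t = np.arange(fps * seconds) / fps
rng = np.random.default_rng(0)
green = 120 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.2 * rng.standard_normal(t.size)
print(round(estimate_heart_rate(green, fps)))  # 72
```

In a real system the `green_means` series would come from a face tracker that follows the forehead as the user moves, which is the part Luo's group improved.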
The other side of Luo’s program relies on data science: interpreting a user’s social media posts and even keystrokes to help determine how the person is feeling.
“To do that, we actually do something that other researchers haven’t done,” Luo said, explaining that “they only look at the text information.” He noted that social media posts are often so short, and contain so many acronyms and typos, that even the most advanced text processing algorithms have difficulty interpreting them. In their approach, Luo and his colleagues also analyzed the content of images attached to a user’s posts, which adds another dimension to the sentiment analysis.
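The benefit of adding the image channel can be sketched with a toy example. This is not the paper's actual model; it is a hypothetical "late fusion" illustration in which a crude text score and an assumed image-derived score are simply averaged with weights.

```python
# Hypothetical lexicon and weights, for illustration only.
POSITIVE = {"love", "great", "happy", "fun"}
NEGATIVE = {"sad", "awful", "alone", "tired"}

def text_sentiment(post):
    """Crude lexicon score in [-1, 1]; short, noisy posts often land near 0."""
    words = post.lower().split()
    raw = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return max(-1.0, min(1.0, raw / 3.0))

def fuse(text_score, image_score, w_text=0.6):
    """Weighted late fusion: the image channel breaks ties when the text
    alone is too short or ambiguous to be informative."""
    return w_text * text_score + (1.0 - w_text) * image_score

# An ambiguous post scores 0 on text alone; a smiling face detected in the
# attached photo (image_score = +0.8, assumed) tips the combined score positive.
print(round(fuse(text_sentiment("idk lol"), 0.8), 2))  # 0.32
```

A real multimodal system would replace both hand-written scorers with learned models, but the fusion step captures why an attached photo can disambiguate a post the text algorithms cannot read.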
An additional author of the paper was Dr. Vincent Silenzio, an associate professor in the Department of Psychiatry at the URMC. Luo noted that the new technology could be useful in hospitals and clinics, where patients’ moods could be monitored by a camera and a computer, giving doctors better information on which patients were in need of the most urgent attention.
The researchers have plans for the technology to be released as an app for smartphones and personal computers, although Luo noted that an app would likely take one to two more years to appear on the market. A commercially available app could keep tabs on a user’s moods, “a self-awareness that they normally don’t have,” Luo said.
Passanisi is a member of the class of 2017.