Researchers are
getting closer to building machines that can interpret not just the words
people say, but also the emotion that informs their meaning. The work arrives
at the start of the telemedicine revolution, valuable not only for remote and
developing regions but also for helping specialists in the developed world
consult more effectively. In the future, multimodal machine learning could
plausibly be used to help assess disorders and even diseases remotely, perhaps
even in pandemic situations such as the recent Ebola outbreak.
Clinicians have
long assessed patients' nonverbal behavior subjectively, but the researchers
are now offering ways to do it objectively. They have developed algorithms
that recognize nonverbal cues such as facial expressions, posture, gestures,
and what is called paralanguage with a high degree of accuracy. In the past
five to 10 years the field has gone from talking about the concept to showing
concrete examples. That was the hard part; now the researchers have the
attention of the medical community and are taking the next small steps toward
applying the science to mental health assessment and treatment.
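As a rough illustration of the idea, the sketch below shows one common way multimodal systems are built: per-modality feature vectors (for face, posture, gesture, and paralanguage) are concatenated and fed to a single classifier. The feature dimensions, labels, and data here are synthetic placeholders for explanation only; they are not the researchers' actual pipeline.

```python
# Toy sketch of feature-level ("early") fusion across nonverbal modalities.
# All dimensions and data are synthetic assumptions for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples = 200

# Hypothetical per-modality features extracted by upstream recognizers,
# e.g. facial action units, body-pose angles, gesture statistics, and
# prosodic measures such as pitch and speaking rate.
face = rng.normal(size=(n_samples, 17))          # facial-expression features
posture = rng.normal(size=(n_samples, 8))        # posture features
gesture = rng.normal(size=(n_samples, 12))       # gesture features
paralanguage = rng.normal(size=(n_samples, 6))   # paralanguage/prosody features

# Synthetic binary labels (e.g. a clinical indicator present / absent).
y = rng.integers(0, 2, size=n_samples)

# Early fusion: concatenate all modality features into one vector per sample.
X = np.hstack([face, posture, gesture, paralanguage])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

Real systems often go further, learning modality-specific models and fusing their predictions, but the basic principle of combining cues from several channels is the same.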