Friday, September 16, 2016
The machine has learned to predict emotions from MRI
Neuroscientists at Duke University have developed a technology for mapping spontaneous emotional states with the help of magnetic resonance imaging (MRI). The results are presented in the journal PLOS Biology.
The study builds on work the scientists carried out in 2015, in which 32 volunteers took part. The subjects were placed in an MRI scanner and shown two music clips and two film segments intended to provoke seven emotions: fear, sadness, surprise, anger, joy, contentment, and a neutral state. Afterwards, the participants completed a self-assessment questionnaire about their state. This allowed the authors to identify brain regions whose activity is characteristic of each of these experiences.
In the new study, the data were analyzed with a machine-learning algorithm. The program learned to recognize the signatures of emotional states in individual subjects and to generalize them to the remaining participants, regardless of individual differences in brain function and level of motivation. The researchers then repeated the experiment with a new design. In the first stage, 21 people were scanned, and no stimuli were presented to provoke emotions.
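The article does not describe the algorithm itself, but a cross-subject emotion decoder of this kind is commonly built as a multi-class classifier trained on activity patterns from some subjects and tested on entirely held-out subjects. The sketch below is illustrative only, assuming hypothetical arrays X (scans by features) and y (emotion labels) and using scikit-learn; none of the names, sizes, or model choices come from the paper.

```python
# Illustrative sketch (not the authors' code): cross-subject emotion decoding
# from fMRI activity patterns, evaluated on held-out subjects.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupShuffleSplit
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical data: 32 subjects, 50 scans each, 200 region-averaged features,
# 7 emotion labels (fear, sadness, surprise, anger, joy, contentment, neutral).
n_subjects, n_scans, n_features, n_emotions = 32, 50, 200, 7
X = rng.normal(size=(n_subjects * n_scans, n_features))      # activity patterns
y = rng.integers(0, n_emotions, size=n_subjects * n_scans)   # emotion labels
groups = np.repeat(np.arange(n_subjects), n_scans)           # subject IDs

# Train on some subjects, test on unseen subjects, so the decoder must
# generalize across individual differences in brain function.
splitter = GroupShuffleSplit(n_splits=1, test_size=0.25, random_state=0)
train_idx, test_idx = next(splitter.split(X, y, groups))

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X[train_idx], y[train_idx])
print("held-out-subject accuracy:", clf.score(X[test_idx], y[test_idx]))
```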
Participants gave verbal self-reports of their state every 30 seconds, while the scanner recorded hemodynamic features every two seconds. The authors' aim was to check whether the emotional patterns recorded earlier would reappear in the brain in the absence of stimuli, and whether the new algorithm could recognize them. Each scan was compared with the previously compiled "emotion maps". The results showed that the algorithm could successfully predict the emotional state ten seconds before the self-report.
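One simple way to picture this comparison of scans against "emotion maps" is template matching: each resting scan is correlated with a per-emotion template pattern, and the best-matching emotion becomes the prediction to be checked against the later self-report. The following is a minimal sketch under those assumptions, with made-up data in place of real fMRI features; it is not the authors' method.

```python
# Illustrative sketch (not the authors' method): match each new scan to
# per-emotion template patterns by correlation and take the best match.
import numpy as np

EMOTIONS = ["fear", "sadness", "surprise", "anger", "joy", "contentment", "neutral"]

def build_templates(X, y, n_emotions):
    """Average the training patterns for each emotion into one template per emotion."""
    return np.stack([X[y == k].mean(axis=0) for k in range(n_emotions)])

def decode_scan(scan, templates):
    """Return the index of the template most correlated with this scan."""
    corrs = [np.corrcoef(scan, t)[0, 1] for t in templates]
    return int(np.argmax(corrs))

rng = np.random.default_rng(1)
n_features = 200

# Hypothetical training data and templates (stand-ins for the earlier experiment).
X_train = rng.normal(size=(700, n_features))
y_train = rng.integers(0, len(EMOTIONS), size=700)
templates = build_templates(X_train, y_train, len(EMOTIONS))

# A resting scan acquired every 2 s; decode it and report the predicted emotion,
# which would then be compared with the self-report given ~10 s later.
scan = rng.normal(size=n_features)
print("predicted emotion:", EMOTIONS[decode_scan(scan, templates)])
```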
In the second phase of the study, 499 people took part in the experiment. Each of them spent nine minutes in the scanner. The analysis was conducted without any stimulus; self-assessment was carried out after the fact and was designed to identify depressive symptoms and signs of anger. The test results confirmed the effectiveness of the algorithm, despite the longer interval between the recording of hemodynamics and the verbal feedback.
According to the researchers, the technology could be used, among other things, to help people with alexithymia, who have difficulty understanding and verbalizing their own emotions. In addition, the results could help clarify the physiological mechanisms of emotional states.