pub:research

Revisions compared: 2021/11/16 17:32 (MRC 2021 papers, kkutt) → 2022/05/25 10:14 (AfCAI2022 paper published, kkutt)
  
===== Papers =====

=== AfCAI2022 ===
  * K. Kutt, P. Sobczyk, and G. J. Nalepa, "**Evaluation of Selected APIs for Emotion Recognition from Facial Expressions**," in Bio-inspired Systems and Applications: from Robotics to Ambient Intelligence. IWINAC 2022 Proceedings, Part II, 2022, pp. 65–74.
  * {{ :pub:kkt2022afcai.pdf |Full text draft available}}
  * ++Abstract | Facial expressions convey the vast majority of the emotional information contained in social utterances. From the point of view of affective intelligent systems, it is therefore important to develop appropriate emotion recognition models based on facial images. As a result of the high interest of the research and industrial communities in this problem, many ready-to-use tools are being developed, which can be used via suitable web APIs. In this paper, two of the most popular APIs were tested: the Microsoft Face API and the Kairos Emotion Analysis API. The evaluation was performed on images representing 8 emotions (anger, contempt, disgust, fear, joy, sadness, surprise and neutral) distributed across 4 benchmark datasets: Cohn-Kanade (CK), Extended Cohn-Kanade (CK+), Amsterdam Dynamic Facial Expression Set (ADFES) and Radboud Faces Database (RaFD). The results indicated a significant advantage of the Microsoft API in the accuracy of emotion recognition, both in photos taken en face and at a 45° angle. The Microsoft API also recognises a larger set of emotions, as contempt and neutral are included as well.++
  
=== MRC2021b ===
  * ++Abstract | We are aiming at developing a technology to detect, identify and interpret human emotional states. We believe that it can be provided based on the integration of context-aware systems and affective computing paradigms. We are planning to identify and characterize affective context data, and to provide knowledge-based models that identify and interpret affects based on these data. A working name for this technology is simply AfCAI: Affective Computing with Context Awareness for Ambient Intelligence.++
  
===== Projects =====

  * **Personality, Affective Context and the Brain (PANBA)** (01.2021–05.2022; research minigrant in the [[https://id.uj.edu.pl/en_GB/digiworld|DigiWorld Priority Research Area UJ]], project no. U1U/P06/NO/02.02; leader: [[pub:kkt|Krzysztof Kutt]]) aims to continue the efforts made in [[pub:biraffe|BIRAFFE1 and BIRAFFE2, oriented towards developing methods for the affective personalization of intelligent systems]]. The project focuses on analyzing data from the BIRAFFE2 experiment and on preparing a new research procedure (BIRAFFE3) that includes the use of EEG. For more details, see [[https://geist.re/pub:projects:panba:start|the dedicated page in the GEIST.re wiki]].
  
===== Tools and Datasets =====
  • pub/research.txt
  • Last modified: 2024/01/25 12:26
  • by kkutt