80 datasets found
  1. Facial Emotion Recognition Fer Market Report | Global Forecast From 2025 To...

    • dataintelo.com
    csv, pdf, pptx
    Updated Jan 7, 2025
    Cite
    Dataintelo (2025). Facial Emotion Recognition Fer Market Report | Global Forecast From 2025 To 2033 [Dataset]. https://dataintelo.com/report/facial-emotion-recognition-fer-market
    Explore at:
    Available download formats: pptx, csv, pdf
    Dataset updated
    Jan 7, 2025
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Facial Emotion Recognition (FER) Market Outlook



    The global facial emotion recognition (FER) market size was valued at approximately USD 4.2 billion in 2023 and is projected to reach around USD 12.5 billion by 2032, growing at a robust CAGR of 12.8% during the forecast period. The increasing adoption of artificial intelligence and machine learning technologies in various applications, including healthcare, retail, and security, is a major growth factor driving this market.
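    As a quick sanity check, the implied growth rate can be recomputed from the figures in the paragraph; a minimal sketch (USD billions, 2023 base to the 2032 horizon):

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over `years` periods."""
    return (end_value / start_value) ** (1 / years) - 1

# USD 4.2B (2023) growing to USD 12.5B (2032) over 9 years
rate = cagr(4.2, 12.5, 2032 - 2023)
print(f"{rate:.1%}")  # ~12.9%, consistent with the reported 12.8% CAGR
```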



    A significant growth factor for the FER market is the rising demand for enhanced security systems. The integration of FER technology in surveillance systems has enabled more accurate monitoring and identification of individuals based on their emotional states. Security agencies and law enforcement bodies are increasingly deploying these systems to preemptively detect suspicious activities, thereby reducing potential threats. Additionally, the ongoing advancements in AI algorithms have significantly improved the accuracy and efficiency of FER systems, further boosting their adoption across various sectors.



    Another driver is the healthcare sector's growing interest in utilizing FER for patient care and emotional well-being. FER technology is being used to monitor patients' emotional states, providing insights that can improve mental health treatment and care plans. It is particularly beneficial in scenarios where patients may have difficulty communicating their feelings, such as in the case of certain neurological disorders. Moreover, the COVID-19 pandemic has accelerated the adoption of telemedicine, where FER can play a crucial role in remote patient monitoring, making the technology even more valuable in contemporary healthcare settings.



    The retail sector is also a pivotal growth area for the FER market. Retailers are increasingly leveraging FER technology to enhance customer experience and optimize sales strategies. By analyzing customers' facial expressions, retailers can gain insights into consumer preferences and behaviors, allowing them to personalize marketing efforts and improve product placements. This data-driven approach helps in increasing customer satisfaction and loyalty, thereby positively impacting the market growth.



    The role of emotional intelligence in technology is becoming increasingly significant, especially in the realm of facial emotion recognition. As businesses and healthcare providers strive to understand and respond to human emotions, the integration of emotional intelligence into FER systems is proving invaluable. This integration allows systems to not only recognize facial expressions but also interpret the underlying emotional context, leading to more nuanced and effective interactions. By leveraging emotional insights, companies can enhance customer experiences, improve patient care, and even bolster security measures. The ability to accurately gauge emotions is transforming how industries engage with individuals, making emotional intelligence a cornerstone of modern FER technology.



    Regionally, North America holds the dominant share in the FER market, driven by substantial investments in AI technologies and the presence of major tech companies. The region's advanced technological infrastructure and high adoption rate of innovative solutions contribute significantly to market growth. Additionally, Europe and Asia Pacific are emerging as lucrative markets due to increasing investments in security and surveillance systems and growing awareness about the benefits of FER technology. The Asia Pacific region, in particular, is expected to witness the highest CAGR during the forecast period, fueled by rapid economic growth and technological advancements in countries like China and India.



    Component Analysis



    The FER market by component is segmented into software, hardware, and services. The software segment holds a significant share of the market due to the increasing demand for advanced AI algorithms and machine learning models that enhance FER capabilities. Software solutions are crucial for processing and analyzing facial expressions, and continuous advancements in AI technology are driving their adoption. Many companies are investing heavily in developing sophisticated FER software that can be integrated seamlessly into various applications, from security systems to customer service platforms.



    Hardware components, such as cameras and sensors, are indispensable for capturing facial data. The hardwa

  2. Recognition rate of the proposed FER system using IMFDB dataset of facial...

    • plos.figshare.com
    xls
    Updated May 31, 2023
    + more versions
    Cite
    Muhammad Hameed Siddiqi; Md. Golam Rabiul Alam; Choong Seon Hong; Adil Mehmood Khan; Hyunseung Choo (2023). Recognition rate of the proposed FER system using IMFDB dataset of facial expressions (Unit: %). [Dataset]. http://doi.org/10.1371/journal.pone.0162702.t005
    Explore at:
    Available download formats: xls
    Dataset updated
    May 31, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Muhammad Hameed Siddiqi; Md. Golam Rabiul Alam; Choong Seon Hong; Adil Mehmood Khan; Hyunseung Choo
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Recognition rate of the proposed FER system using IMFDB dataset of facial expressions (Unit: %).

  3. Facial Emotion Recognition (FER) Report

    • datainsightsmarket.com
    doc, pdf, ppt
    Updated Jun 10, 2025
    Cite
    Data Insights Market (2025). Facial Emotion Recognition (FER) Report [Dataset]. https://www.datainsightsmarket.com/reports/facial-emotion-recognition-fer-1961815
    Explore at:
    Available download formats: pdf, ppt, doc
    Dataset updated
    Jun 10, 2025
    Dataset authored and provided by
    Data Insights Market
    License

    https://www.datainsightsmarket.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The Facial Emotion Recognition (FER) market is experiencing robust growth, driven by increasing demand for advanced human-computer interaction and the proliferation of applications across various sectors. The market, estimated at $2 billion in 2025, is projected to achieve a Compound Annual Growth Rate (CAGR) of 20% from 2025 to 2033, reaching an estimated market value of $8 billion by 2033. This growth is fueled by several key drivers, including advancements in artificial intelligence (AI) and machine learning (ML) algorithms, the decreasing cost of high-resolution cameras and sensors, and rising adoption of FER technology in diverse fields such as automotive, healthcare, security, and marketing. The increasing availability of large, diverse datasets for training sophisticated FER models further contributes to market expansion.

    Trends include the development of more accurate and robust algorithms capable of handling variations in lighting, pose, and occlusion, as well as the integration of FER with other technologies like biometric authentication and virtual reality. However, challenges remain, including concerns regarding data privacy and bias in algorithms, the need for robust and reliable performance across diverse populations, and the high cost of implementation in some sectors. Despite these restraints, the market is expected to maintain its strong growth trajectory.

    Segmentation within the market includes software and hardware components, cloud-based and on-premise solutions, and various application verticals. Key players such as Pushpak AI, Cameralyze, and others are actively involved in developing and deploying innovative FER solutions, fostering competition and innovation. Regional variations in adoption rates are anticipated, with North America and Europe likely leading the market initially due to higher technological adoption and strong regulatory frameworks. However, the Asia-Pacific region is poised for significant growth in the coming years, driven by increasing smartphone penetration and expanding digital infrastructure. Overall, the future of the FER market appears bright, driven by continuous technological advancements, increasing demand, and the expanding application of this powerful technology.

  4. Confusion matrix of the proposed FER system with HMM (as a recognition...

    • plos.figshare.com
    xls
    Updated Jun 4, 2023
    + more versions
    Cite
    Muhammad Hameed Siddiqi; Md. Golam Rabiul Alam; Choong Seon Hong; Adil Mehmood Khan; Hyunseung Choo (2023). Confusion matrix of the proposed FER system with HMM (as a recognition model), instead of using the proposed recognition model (that is MEMM model) using USTC-NVIE dataset of facial expressions (Unit: %). [Dataset]. http://doi.org/10.1371/journal.pone.0162702.t016
    Explore at:
    Available download formats: xls
    Dataset updated
    Jun 4, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Muhammad Hameed Siddiqi; Md. Golam Rabiul Alam; Choong Seon Hong; Adil Mehmood Khan; Hyunseung Choo
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Confusion matrix of the proposed FER system with HMM (as a recognition model), instead of using the proposed recognition model (that is MEMM model) using USTC-NVIE dataset of facial expressions (Unit: %).
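    Since the matrix is reported in percent with rows normalized per true class, the average recognition rate is simply the mean of its diagonal; a minimal sketch with a hypothetical 3-class matrix (not the paper's values):

```python
def mean_recognition_rate(confusion_pct):
    """Average per-class recognition rate from a row-normalized confusion matrix (unit: %)."""
    return sum(row[i] for i, row in enumerate(confusion_pct)) / len(confusion_pct)

# Hypothetical 3-class matrix; each row sums to 100
matrix = [
    [90.0, 6.0, 4.0],
    [5.0, 88.0, 7.0],
    [3.0, 5.0, 92.0],
]
print(mean_recognition_rate(matrix))  # 90.0
```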

  5. Fer Dataset

    • universe.roboflow.com
    zip
    Updated Jul 31, 2024
    + more versions
    Cite
    Facial Emotion Recognition (2024). Fer Dataset [Dataset]. https://universe.roboflow.com/facial-emotion-recognition-uyurx/fer-ghga7/dataset/1
    Explore at:
    Available download formats: zip
    Dataset updated
    Jul 31, 2024
    Dataset authored and provided by
    Facial Emotion Recognition
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Variables measured
    Sad Bounding Boxes
    Description

    FER

    ## Overview
    
    FER is a dataset for object detection tasks - it contains Sad annotations for 1,539 images.
    
    ## Getting Started
    
    You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
    
      ## License
    
      This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
    
  6. Data from: DEFAULTS Dataset

    • researchdata.edu.au
    • bridges.monash.edu
    Updated Mar 10, 2025
    Cite
    Sanjeev Nahulanthran; Mor Vered; Leimin Tian; Dana Kulic (2025). DEFAULTS Dataset [Dataset]. http://doi.org/10.26180/28443197.V2
    Explore at:
    Dataset updated
    Mar 10, 2025
    Dataset provided by
    Monash University
    Authors
    Sanjeev Nahulanthran; Mor Vered; Leimin Tian; Dana Kulic
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Facial expression recognition (FER) has emerged as a promising approach to the development of emotion-aware intelligent agents and systems. However, key challenges remain in utilizing FER in real-world contexts, including ensuring user understanding and establishing a suitable level of user trust. We developed a novel explanation method utilizing Facial Action Units (FAUs) to explain the output of a FER model through both textual and visual modalities. We conducted an empirical user study evaluating user understanding and trust, comparing our approach to state-of-the-art eXplainable AI (XAI) methods. Our results indicate that both combined visual-and-textual and textual-only FAU-based explanations resulted in better user understanding of the FER model. We also show that all modalities of FAU-based methods improved users' appropriate trust in the FER model.
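    The FAU-based textual explanations described above can be sketched roughly as a mapping from detected action units to their FACS descriptions; the AU names below follow standard FACS conventions, while the function and its inputs are illustrative assumptions, not the authors' implementation:

```python
# Standard FACS descriptions for a few facial action units
AU_DESCRIPTIONS = {
    "AU4": "brow lowerer",
    "AU6": "cheek raiser",
    "AU12": "lip corner puller",
    "AU15": "lip corner depressor",
}

def explain_prediction(emotion, active_aus):
    """Turn a FER label plus detected action units into a textual explanation."""
    cues = ", ".join(f"{au} ({AU_DESCRIPTIONS.get(au, 'unknown')})" for au in active_aus)
    return f"Predicted '{emotion}' because these action units were active: {cues}."

print(explain_prediction("happy", ["AU6", "AU12"]))
```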

  7. Facial Emotion Recognition (FER) Report

    • archivemarketresearch.com
    doc, pdf, ppt
    Updated May 27, 2025
    Cite
    Archive Market Research (2025). Facial Emotion Recognition (FER) Report [Dataset]. https://www.archivemarketresearch.com/reports/facial-emotion-recognition-fer-556622
    Explore at:
    Available download formats: doc, ppt, pdf
    Dataset updated
    May 27, 2025
    Dataset authored and provided by
    Archive Market Research
    License

    https://www.archivemarketresearch.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The Facial Emotion Recognition (FER) market is experiencing robust growth, driven by increasing adoption across diverse sectors. While precise figures for market size and CAGR are not provided, considering the rapid advancements in AI and computer vision, a reasonable estimation places the 2025 market size at approximately $2.5 billion. This substantial value reflects the growing demand for emotion-aware applications in various industries, including marketing and advertising, healthcare, security, and human-computer interaction. The market is projected to exhibit a Compound Annual Growth Rate (CAGR) of around 20% from 2025 to 2033, indicating a significant expansion potential. This growth is fueled by several factors: the decreasing cost and improved accuracy of facial recognition technology, increasing availability of large datasets for training AI algorithms, and rising consumer acceptance of AI-powered solutions. Furthermore, the integration of FER technology with other emerging technologies, such as IoT and big data analytics, is further broadening its applications and accelerating market expansion.

    The competitive landscape is dynamic, with established players like Sony Depthsense and NEC Global alongside innovative startups such as Pushpak AI and Cameralyze. The market is characterized by continuous innovation, leading to the development of more sophisticated and accurate FER systems. Challenges remain, including concerns about data privacy, algorithmic bias, and the need for robust regulatory frameworks. Despite these hurdles, the overall outlook for the FER market remains optimistic, with considerable opportunities for growth and development in the coming years. The key to success for market players will be a focus on developing ethical and accurate technologies while addressing the inherent privacy concerns. This will involve continuous improvement of algorithms, transparent data handling practices, and engagement with stakeholders to build trust and ensure responsible innovation.

  8. UIBAIFED - Artificial Intelligence Facial Expression Dataset

    • zenodo.org
    zip
    Updated Apr 9, 2025
    Cite
    Miquel Mascaró Oliver; Miquel Mascaró Oliver (2025). UIBAIFED - Artificial Intelligence Facial Expression Dataset [Dataset]. http://doi.org/10.5281/zenodo.15181380
    Explore at:
    Available download formats: zip
    Dataset updated
    Apr 9, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Miquel Mascaró Oliver; Miquel Mascaró Oliver
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    Apr 2025
    Description

    UIBAIFED is a novel facial expression dataset designed to enhance Facial Expression Recognition (FER) by providing high-quality, realistic images labeled with detailed demographic attributes, including age group, gender, and ethnicity. Unlike existing datasets, UIBAIFED incorporates a fine-grained classification of 22 micro-expressions, based on the universal facial expressions defined by Ekman and the micro-expression taxonomy proposed by Gary Faigin. The dataset was generated using Stable Diffusion and validated with a convolutional neural network (CNN), which achieved 82% accuracy in expression classification. These results highlight the dataset's reliability and potential to improve FER systems. UIBAIFED fills a critical gap in the field by offering a more comprehensive labeling system, enabling future research on expression recognition across demographic groups and advancing the robustness of FER models in diverse applications.

  9. Facial Emotion Expressions (FER) Balanced

    • kaggle.com
    Updated Mar 27, 2025
    Cite
    Ashish Sawant13 (2025). Facial Emotion Expressions (FER) Balanced [Dataset]. https://www.kaggle.com/datasets/ashishsawant13/facial-emotion-expressions-fer-balanced/code
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Mar 27, 2025
    Dataset provided by
    Kaggle (http://kaggle.com/)
    Authors
    Ashish Sawant13
    License

    MIT License: https://opensource.org/licenses/MIT
    License information was derived automatically

    Description

    This dataset is a modified version of the dataset "Facial Emotion Expressions" available here -> dataset link.
    This modified version addresses class imbalance by adding augmented images to the class "disgust" (original amount: 432; new amount: 4,300+) and reducing the number of images in the class "happy" (original amount: 7,300+; new amount: 4,500). Peace out 😊✌️.

  10. FER-2013

    • huggingface.co
    Cite
    Jim, FER-2013 [Dataset]. https://huggingface.co/datasets/JimmyUnleashed/FER-2013
    Explore at:
    Authors
    Jim
    License

    MIT License: https://opensource.org/licenses/MIT
    License information was derived automatically

    Description

    Facial Expression Recognition Challenge (ICML 2013)

      Overview
    

    This dataset was created for the Challenges in Representation Learning: Facial Expression Recognition Challenge, part of the ICML 2013 Workshop. The challenge focused on evaluating how well learning algorithms can generalize to newly introduced data, particularly for facial expression classification.

    Start Date: April 13, 2013
    End Date: May 25, 2013

    The dataset contains facial images labeled with one of… See the full description on the dataset page: https://huggingface.co/datasets/JimmyUnleashed/FER-2013.

  11. Fer Disgust Dataset

    • universe.roboflow.com
    zip
    Updated Jul 3, 2024
    Cite
    APU (2024). Fer Disgust Dataset [Dataset]. https://universe.roboflow.com/apu-hyumi/fer-disgust/dataset/1
    Explore at:
    Available download formats: zip
    Dataset updated
    Jul 3, 2024
    Dataset authored and provided by
    APU
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Variables measured
    Facial Expression Bounding Boxes
    Description

    FER Disgust

    ## Overview
    
    FER Disgust is a dataset for object detection tasks - it contains Facial Expression annotations for 461 images.
    
    ## Getting Started
    
    You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
    
      ## License
    
      This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
    
  12. Balanced Affectnet Dataset (75×75, RGB)

    • kaggle.com
    Updated Apr 30, 2025
    Cite
    dolly prajapati 182 (2025). Balanced Affectnet Dataset (75×75, RGB) [Dataset]. https://www.kaggle.com/datasets/dollyprajapati182/balanced-affectnet
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Apr 30, 2025
    Dataset provided by
    Kaggle (http://kaggle.com/)
    Authors
    dolly prajapati 182
    Description

    The Balanced Affectnet Dataset is a uniformly processed, class-balanced, and augmented version of the affect-fer composite dataset. This curated version is tailored for deep learning and machine learning applications in Facial Emotion Recognition (FER). It addresses class imbalance and standardizes input dimensions to enhance model performance and comparability.

    🎯 Purpose The goal of this dataset is to balance the representation of eight emotion classes, enabling the training of fairer and more robust FER models. Each emotion class contains an equal number of images, facilitating consistent model learning and evaluation across all classes.

    🧾 Dataset Characteristics Source: Based on the Affectnet dataset

    Image Format: RGB .png

    Image Size: 75 × 75 pixels

    Emotion Classes (8): anger, contempt, disgust, fear, happy, neutral, sad, surprise

    Total Images: 41,008

    Images per Class: 5,126

    ⚙️ Preprocessing Pipeline Each image in the dataset has been preprocessed using the following steps:

    ✅ Converted to grayscale

    ✅ Resized to 75×75 pixels

    ✅ Augmented using:

    Random rotation

    Horizontal flip

    Brightness adjustment

    Contrast enhancement

    Sharpness modification

    This results in a clean, uniform, and diverse dataset ideal for FER tasks.
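    A rough sketch of such a pipeline using Pillow, with assumed parameter ranges (the dataset description does not specify them):

```python
import random
from PIL import Image, ImageEnhance, ImageOps

def preprocess(img):
    """Sketch of the listed pipeline; rotation/enhancement ranges are assumptions."""
    img = img.convert("L").resize((75, 75))              # grayscale, 75x75
    img = img.rotate(random.uniform(-15, 15))            # random rotation
    if random.random() < 0.5:
        img = ImageOps.mirror(img)                       # horizontal flip
    img = ImageEnhance.Brightness(img).enhance(random.uniform(0.8, 1.2))
    img = ImageEnhance.Contrast(img).enhance(random.uniform(0.8, 1.2))
    img = ImageEnhance.Sharpness(img).enhance(random.uniform(0.8, 1.2))
    return img

sample = Image.new("RGB", (96, 96), color=(120, 100, 90))  # stand-in image
out = preprocess(sample)
print(out.size, out.mode)  # (75, 75) L
```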

    Testing (10%): 4,100 images

    Training (80% of remainder): 29,526 images

    Validation (20% of remainder): 7,382 images

    ✅ Advantages ⚖️ Balanced Classes: Equal images across all eight emotion classes

    🧠 Model-Friendly: Grayscale, resized format reduces preprocessing overhead

    🚀 Augmented: Improves model generalization and robustness

    📦 Split Ready: Train/Val/Test folders structured per class

    📊 Great for Benchmarking: Ideal for training CNNs, Transformers, and ensemble models for FER

  13. NPU-FER: A web image dataset for facial expression recognition

    • scidb.cn
    Updated Oct 17, 2021
    Cite
    Xianlin Peng; Zhaoqiang Xia; Lei Li; Xiaoyi Feng (2021). NPU-FER: A web image dataset for facial expression recognition [Dataset]. http://doi.org/10.11922/sciencedb.01199
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Oct 17, 2021
    Dataset provided by
    Science Data Bank
    Authors
    Xianlin Peng; Zhaoqiang Xia; Lei Li; Xiaoyi Feng
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset for natural facial expression recognition was constructed from web images. Labeled images were obtained from image search engines using specific keywords; junk-image cleansing algorithms then removed mislabeled images, followed by further manual cleaning. In total, 1,648 images across six facial expression categories were collected, with unbalanced counts and varying resolutions per category.
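    The description does not specify the cleansing algorithms; as an illustration, one common first step in cleaning web-scraped images is removing byte-identical duplicates by hashing:

```python
import hashlib

def remove_exact_duplicates(images):
    """Drop byte-identical images, a simple first step of junk-image cleansing."""
    seen, kept = set(), []
    for data in images:
        digest = hashlib.sha256(data).hexdigest()
        if digest not in seen:
            seen.add(digest)
            kept.append(data)
    return kept

# Toy byte strings standing in for downloaded image files
batch = [b"img-a", b"img-b", b"img-a", b"img-c", b"img-b"]
print(len(remove_exact_duplicates(batch)))  # 3
```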

  14. Fer Dataset

    • universe.roboflow.com
    zip
    Updated Nov 5, 2024
    Cite
    TJU (2024). Fer Dataset [Dataset]. https://universe.roboflow.com/tju-ete2s/fer-neqll/model/4
    Explore at:
    Available download formats: zip
    Dataset updated
    Nov 5, 2024
    Dataset authored and provided by
    TJU
    Variables measured
    Cats AtyT Bounding Boxes
    Description

    FER

    ## Overview
    
    FER is a dataset for object detection tasks - it contains Cats AtyT annotations for 745 images.
    
    ## Getting Started
    
    You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
    
  15. Emotion Detection

    • kaggle.com
    Updated Dec 11, 2020
    Cite
    ARES (2020). Emotion Detection [Dataset]. https://www.kaggle.com/ananthu017/emotion-detection-fer/notebooks
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Dec 11, 2020
    Dataset provided by
    Kaggle
    Authors
    ARES
    License

    https://creativecommons.org/publicdomain/zero/1.0/

    Description

    The dataset contains 35,685 examples of 48×48-pixel grayscale face images, divided into train and test sets. Images are categorized by the emotion shown in the facial expression (happiness, neutral, sadness, anger, surprise, disgust, fear).
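    A minimal sketch of loading one such 48x48 grayscale face into a normalized array (the helper name and demo image are illustrative, not part of the dataset):

```python
import numpy as np
from PIL import Image

def load_face(img):
    """Return a 48x48 float array in [0, 1] from a face image."""
    img = img.convert("L").resize((48, 48))  # grayscale, 48x48 as in the dataset
    return np.asarray(img, dtype=np.float32) / 255.0

demo = Image.new("L", (48, 48), color=128)   # stand-in for a dataset image
arr = load_face(demo)
print(arr.shape)  # (48, 48)
```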

  16. FER-2013 subset ES-MAPs

    • zenodo.org
    zip
    Updated Jun 25, 2023
    Cite
    Emmy Yang; Emmy Yang (2023). FER-2013 subset ES-MAPs [Dataset]. http://doi.org/10.5281/zenodo.8068171
    Explore at:
    Available download formats: zip
    Dataset updated
    Jun 25, 2023
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Emmy Yang; Emmy Yang
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    A modified subset of the FER-2013 dataset generated as ES-MAPs (emotional state heatmaps)

  17. AFFEC Multimodal Dataset

    • zenodo.org
    json, zip
    Updated Mar 18, 2025
    Cite
    Meisam Jamshidi Seikavandi; Meisam Jamshidi Seikavandi; Laurits Dixen; Laurits Dixen; Paolo Burelli; Paolo Burelli (2025). AFFEC Multimodal Dataset [Dataset]. http://doi.org/10.5281/zenodo.14794876
    Explore at:
    Available download formats: zip, json
    Dataset updated
    Mar 18, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Meisam Jamshidi Seikavandi; Meisam Jamshidi Seikavandi; Laurits Dixen; Laurits Dixen; Paolo Burelli; Paolo Burelli
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    May 1, 2024
    Description

    Dataset: AFFEC - Advancing Face-to-Face Emotion Communication Dataset

    Overview

    The AFFEC (Advancing Face-to-Face Emotion Communication) dataset is a multimodal dataset designed for emotion recognition research. It captures dynamic human interactions through electroencephalography (EEG), eye-tracking, galvanic skin response (GSR), facial movements, and self-annotations, enabling the study of felt and perceived emotions in real-world face-to-face interactions. The dataset comprises 84 simulated emotional dialogues, 72 participants, and over 5,000 trials, annotated with more than 20,000 emotion labels.

    Dataset Structure

    The dataset follows the Brain Imaging Data Structure (BIDS) format and consists of the following components:

    Root Folder:

    • sub-* : Individual subject folders (e.g., sub-aerj, sub-mdl, sub-xx2)
    • dataset_description.json: General dataset metadata
    • participants.json and participants.tsv: Participant demographics and attributes
    • task-fer_events.json: Event annotations for the FER task
    • README.md: This documentation file

    Subject Folders (sub-):

    Each subject folder contains:

    • Behavioral Data (beh/): Physiological recordings (eye tracking, GSR, facial analysis, cursor tracking) in JSON and TSV formats.
    • EEG Data (eeg/): EEG recordings in .edf and corresponding metadata in .json.
    • Event Files (*.tsv): Trial event data for the emotion recognition task.
    • Channel Descriptions (*_channels.tsv): EEG channel information.
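    Because BIDS tabular files like participants.tsv are plain tab-separated text, they can be read with the standard library alone; a sketch with hypothetical columns (the real file defines its own):

```python
import csv
import io

# Hypothetical stand-in for participants.tsv; the real file defines its own columns
participants_tsv = "participant_id\tage\tgender\nsub-aerj\t24\tF\nsub-mdl\t31\tM\n"

rows = list(csv.DictReader(io.StringIO(participants_tsv), delimiter="\t"))
for row in rows:
    print(row["participant_id"], row["age"])  # one demographics record per subject
```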

    Data Modalities and Channels

    1. Eye Tracking Data

    • Channels: 16 (fixation points, left/right eye gaze coordinates, gaze validity)
    • Sampling Rate: 62 Hz
    • Trials: 5632
    • File Example: sub-

    2. Pupil Data

    • Channels: 21 (pupil diameter, eye position, pupil validity flags)
    • Sampling Rate: 149 Hz
    • Trials: 5632
    • File Example: sub-

    3. Cursor Tracking Data

    • Channels: 4 (cursor X, cursor Y, cursor state)
    • Sampling Rate: 62 Hz
    • Trials: 5632
    • File Example: sub-

    4. Face Analysis Data

    • Channels: Over 200 (2D/3D facial landmarks, gaze detection, facial action units)
    • Sampling Rate: 40 Hz
    • Trials: 5680
    • File Example: sub-

    5. Electrodermal Activity (EDA) and Physiological Sensors

    • Channels: 40 (GSR, body temperature, accelerometer data)
    • Sampling Rate: 50 Hz
    • Trials: 5438
    • File Example: sub-

    6. EEG Data

    • Channels: 63 (EEG electrodes following the 10-20 placement scheme)
    • Sampling Rate: 256 Hz
    • Reference: Left earlobe
    • Trials: 5632
    • File Example: sub-

    7. Self-Annotations

    • Trials: 5807
    • Annotations Per Trial: 4
    • Event Markers: Onset time, duration, trial type, emotion labels
    • File Example: task-fer_events.json

    Experimental Setup

    Participants engaged in a Facial Emotion Recognition (FER) task, where they watched emotionally expressive video stimuli while their physiological and behavioral responses were recorded. Participants provided self-reported ratings for both perceived and felt emotions, differentiating between the emotions they believed the video conveyed and their internal affective experience.

    The dataset enables the study of individual differences in emotional perception and expression by incorporating Big Five personality trait assessments and demographic variables.

    Usage Notes

    • The dataset is formatted in ASCII/UTF-8 encoding.
    • Each modality is stored in JSON, TSV, or EDF format as per BIDS standards.
    • Researchers should cite this dataset appropriately in publications.
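    Because AFFEC follows the BIDS convention, the per-trial event files (*.tsv) can be inspected with standard tabular tooling. The sketch below uses pandas on an inline, made-up excerpt; the actual column names and file paths must be taken from the dataset's own task-fer_events.json sidecar, so treat every column shown here as an illustrative assumption:

```python
import io
import pandas as pd

# Hypothetical excerpt of a BIDS events file. The columns (onset, duration,
# trial_type, emotion) are illustrative; consult task-fer_events.json for
# the actual schema of this dataset.
events_tsv = io.StringIO(
    "onset\tduration\ttrial_type\temotion\n"
    "1.50\t4.00\tvideo\thappy\n"
    "7.25\t4.00\tvideo\tsad\n"
)

# BIDS event files are tab-separated; 'n/a' marks missing values.
events = pd.read_csv(events_tsv, sep="\t", na_values="n/a")
print(events["emotion"].tolist())
```

    In BIDS, the onset and duration columns are always present and expressed in seconds; any further columns, such as the emotion labels here, are dataset-specific.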

    Applications

    AFFEC is well-suited for research in:

    • Affective Computing
    • Human-Agent Interaction
    • Emotion Recognition and Classification
    • Multimodal Signal Processing
    • Neuroscience and Cognitive Modeling
    • Healthcare and Mental Health Monitoring

    Acknowledgments

    This dataset was collected with the support of brAIn lab, IT University of Copenhagen.
    Special thanks to all participants and research staff involved in data collection.

    License

    This dataset is shared under the Creative Commons CC0 License.

    Contact

    For questions or collaboration inquiries, please contact brainlab-staff@o365team.itu.dk.

  18. P

    DFEW Dataset

    • library.toponeai.link
    Updated Jun 16, 2024
    Cite
    Xingxun Jiang; Yuan Zong; Wenming Zheng; Chuangao Tang; Wanchuang Xia; Cheng Lu; Jiateng Liu (2024). DFEW Dataset [Dataset]. https://library.toponeai.link/dataset/dfew
    Explore at:
    Dataset updated
    Jun 16, 2024
    Authors
    Xingxun Jiang; Yuan Zong; Wenming Zheng; Chuangao Tang; Wanchuang Xia; Cheng Lu; Jiateng Liu
    Description

    Facial expression recognition (FER) in the wild has recently attracted considerable research attention, as it is a key step in moving FER techniques from the laboratory to real applications. In this paper, we focus on this challenging but interesting topic and make contributions from three aspects. First, we present a new large-scale 'in-the-wild' dynamic facial expression database, DFEW (Dynamic Facial Expression in the Wild), consisting of over 16,000 video clips from thousands of movies. These video clips contain various challenging interferences found in practical scenarios, such as extreme illumination, occlusions, and capricious pose changes. Second, we propose a novel method, the Expression-Clustered Spatiotemporal Feature Learning (EC-STFL) framework, to deal with dynamic FER in the wild. Third, we conduct extensive benchmark experiments on DFEW using a range of spatiotemporal deep feature learning methods as well as our proposed EC-STFL. Experimental results show that DFEW is a well-designed and challenging database, and that the proposed EC-STFL can promisingly improve the performance of existing spatiotemporal deep neural networks on dynamic FER in the wild. Our DFEW database is publicly available and can be freely downloaded from https://dfew-dataset.github.io/.

  19. R

    Fer Classification Dataset

    • universe.roboflow.com
    zip
    Updated Feb 22, 2025
    Cite
    Interns (2025). Fer Classification Dataset [Dataset]. https://universe.roboflow.com/interns-yw3ts/fer-classification
    Explore at:
    Available download formats: zip
    Dataset updated
    Feb 22, 2025
    Dataset authored and provided by
    Interns
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Variables measured
    Emotion
    Description

    FER Classification

    ## Overview
    
    FER Classification is a dataset for classification tasks - it contains Emotion annotations for 329 images.
    
    ## Getting Started
    
    You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
    
    ## License
    
    This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
    
  20. FER-2013

    • kaggle.com
    Updated Jul 19, 2020
    Cite
    Manas Sambare (2020). FER-2013 [Dataset]. https://www.kaggle.com/datasets/msambare/FER2013/discussion
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Jul 19, 2020
    Dataset provided by
    Kaggle (http://kaggle.com/)
    Authors
    Manas Sambare
    License

    Database Contents License (DbCL) v1.0: http://opendatacommons.org/licenses/dbcl/1.0/

    Description

    The data consists of 48x48 pixel grayscale images of faces. The faces have been automatically registered so that the face is more or less centred and occupies about the same amount of space in each image.

    The task is to categorize each face based on the emotion shown in the facial expression into one of seven categories (0=Angry, 1=Disgust, 2=Fear, 3=Happy, 4=Sad, 5=Surprise, 6=Neutral). The training set consists of 28,709 examples and the public test set consists of 3,589 examples.
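In the commonly distributed CSV form of FER-2013, each face is stored as a space-separated string of 48x48 = 2304 grayscale pixel values next to an integer emotion label. A minimal decoding sketch with NumPy follows; the row layout assumed here matches the widely circulated fer2013.csv file and is an assumption, not something stated by this listing:

```python
import numpy as np

# Label indices as given in the dataset description above.
EMOTIONS = ["Angry", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]

def decode_row(emotion_label, pixel_string):
    """Turn one (label, pixel-string) CSV row into (emotion name, 48x48 uint8 image)."""
    pixels = np.array([int(p) for p in pixel_string.split()], dtype=np.uint8)
    image = pixels.reshape(48, 48)
    return EMOTIONS[emotion_label], image

# Tiny synthetic example: a flat mid-gray "image" labeled 3 (Happy).
row_pixels = " ".join(["128"] * (48 * 48))
name, img = decode_row(3, row_pixels)
print(name, img.shape)  # Happy (48, 48)
```

The same routine can be mapped over a pandas DataFrame column to vectorize an entire split before training.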

Facial Emotion Recognition Fer Market Report | Global Forecast From 2025 To 2033

Description




The retail sector is also a pivotal growth area for the FER market. Retailers are increasingly leveraging FER technology to enhance customer experience and optimize sales strategies. By analyzing customers' facial expressions, retailers can gain insights into consumer preferences and behaviors, allowing them to personalize marketing efforts and improve product placements. This data-driven approach helps in increasing customer satisfaction and loyalty, thereby positively impacting the market growth.



The role of emotional intelligence in technology is becoming increasingly significant, especially in facial emotion recognition. As businesses and healthcare providers strive to understand and respond to human emotions, the integration of emotional intelligence into FER systems is proving invaluable. It allows systems not only to recognize facial expressions but also to interpret the underlying emotional context, leading to more nuanced and effective interactions. By leveraging these emotional insights, companies can enhance customer experiences, improve patient care, and bolster security measures. The ability to accurately gauge emotions is transforming how industries engage with individuals, making emotional intelligence a cornerstone of modern FER technology.



Regionally, North America holds the dominant share in the FER market, driven by substantial investments in AI technologies and the presence of major tech companies. The region's advanced technological infrastructure and high adoption rate of innovative solutions contribute significantly to market growth. Additionally, Europe and Asia Pacific are emerging as lucrative markets due to increasing investments in security and surveillance systems and growing awareness about the benefits of FER technology. The Asia Pacific region, in particular, is expected to witness the highest CAGR during the forecast period, fueled by rapid economic growth and technological advancements in countries like China and India.



Component Analysis



The FER market by component is segmented into software, hardware, and services. The software segment holds a significant share of the market due to the increasing demand for advanced AI algorithms and machine learning models that enhance FER capabilities. Software solutions are crucial for processing and analyzing facial expressions, and continuous advancements in AI technology are driving their adoption. Many companies are investing heavily in developing sophisticated FER software that can be integrated seamlessly into various applications, from security systems to customer service platforms.



Hardware components, such as cameras and sensors, are indispensable for capturing facial data. The hardwa
