23 datasets found
  1. Happy Face Dataset

    • universe.roboflow.com
    Cite
    project (2023). Happy Face Dataset [Dataset]. https://universe.roboflow.com/project-fjp7n/happy-face-pkgvd/dataset/4
    Available download formats: zip
    Dataset updated
    Aug 4, 2023
    Dataset authored and provided by
    project
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Variables measured
    Elderly Happy Face Bounding Boxes
    Description

    Happy Face

    ## Overview
    
    Happy Face is a dataset for object detection tasks - it contains Elderly Happy Face annotations for 456 images.
    
    ## Getting Started
    
    You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
    
      ## License
    
      This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
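
    As a hedged illustration of the "Getting Started" step above, the snippet below sketches how such a Roboflow Universe dataset can be downloaded with the `roboflow` Python package (`pip install roboflow`); the API key is a placeholder, and the "coco" export format is one assumed choice among the formats Roboflow offers.

    ```python
    # Minimal sketch: download the Happy Face dataset from Roboflow Universe.
    from roboflow import Roboflow

    rf = Roboflow(api_key="YOUR_API_KEY")  # placeholder API key
    project = rf.workspace("project-fjp7n").project("happy-face-pkgvd")
    dataset = project.version(4).download("coco")  # export format is an assumption
    print(dataset.location)  # local folder containing images and annotations
    ```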
    
  2. Facial Emotion Detection Dataset

    • salford.figshare.com
    Cite
    Ali Alameer (2025). Facial Emotion Detection Dataset [Dataset]. http://doi.org/10.17866/rd.salford.22495669.v1
    Available download formats: zip
    Dataset updated
    Jan 23, 2025
    Dataset provided by
    University of Salford
    Authors
    Ali Alameer
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The Facial Emotion Detection Dataset is a collection of images of individuals with two different emotions - happy and sad. The dataset was captured using a mobile phone camera and contains photos taken from different angles and backgrounds.

    The dataset contains 637 photos, plus an additional 127 images from previous work. Of these, 402 images show happy faces and 366 show sad faces. Each individual contributed a minimum of 10 images of each expression.

    The project faced challenges in terms of time constraints and people's constraints, which limited the number of individuals who participated. Despite the limitations, the dataset can be used for deep learning projects and real-time emotion detection models. Future work can expand the dataset by capturing more images to improve the accuracy of the model. The dataset can also be used to create a custom object detection model to evaluate other types of emotional expressions.
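
    To illustrate the kind of deep learning use the authors mention, here is a minimal, hedged sketch that loads the images for binary happy/sad classification with torchvision; the folder layout (`data/happy/`, `data/sad/`) is an assumption about how the extracted archive might be organized.

    ```python
    # Sketch: load happy/sad images for binary classification (assumed layout).
    import torch
    from torchvision import datasets, transforms

    tfm = transforms.Compose([
        transforms.Resize((224, 224)),  # normalize image sizes
        transforms.ToTensor(),
    ])
    ds = datasets.ImageFolder("data", transform=tfm)  # labels from subfolder names
    loader = torch.utils.data.DataLoader(ds, batch_size=32, shuffle=True)
    images, labels = next(iter(loader))
    print(images.shape, ds.classes)  # e.g. torch.Size([32, 3, 224, 224]), ['happy', 'sad']
    ```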

  3. Happy and Sad Faces Data

    • kaggle.com
    Cite
    shantanugupta2004 (2024). Happy and Sad Faces Data [Dataset]. https://www.kaggle.com/datasets/shantanugupta2004/happy-and-sad-faces-data/code
    Available download formats: zip (32378446 bytes)
    Dataset updated
    Jun 18, 2024
    Authors
    shantanugupta2004
    License

    MIT License: https://opensource.org/licenses/MIT
    License information was derived automatically

    Description

    Dataset

    This dataset was created by shantanugupta2004

    Released under MIT


  4. East Asian Facial Expression Images Dataset

    • futurebeeai.com
    Cite
    FutureBee AI (2022). East Asian Facial Expression Images Dataset [Dataset]. https://www.futurebeeai.com/dataset/image-dataset/facial-images-expression-east-asia
    Available download formats: wav
    Dataset updated
    Aug 1, 2022
    Dataset provided by
    FutureBeeAI
    Authors
    FutureBee AI
    License

    https://www.futurebeeai.com/data-license-agreement

    Area covered
    East Asia
    Dataset funded by
    FutureBeeAI
    Description

    Introduction

    Welcome to the East Asian Facial Expression Image Dataset, meticulously curated to enhance expression recognition models and support the development of advanced biometric identification systems, KYC models, and other facial recognition technologies.

    Facial Expression Data

    This dataset comprises over 2000 facial expression images, divided into participant-wise sets with each set including:

    Expression Images: 5 different high-quality images per individual, each capturing a distinct facial emotion like Happy, Sad, Angry, Shocked, and Neutral.

    Diversity and Representation

    The dataset includes contributions from a diverse network of individuals across East Asian countries, such as:

    Geographical Representation: Participants from East Asian countries, including China, Japan, Philippines, Malaysia, Singapore, Thailand, Vietnam, Indonesia, and more.
    Participant Profile: Participants range from 18 to 70 years old, with a roughly 60:40 male-to-female ratio.
    File Format: The dataset contains images in JPEG and HEIC formats.

    Quality and Conditions

    To ensure high utility and robustness, all images are captured under varying conditions:

    Lighting Conditions: Images are taken in different lighting environments to ensure variability and realism.
    Backgrounds: A variety of backgrounds are available to enhance model generalization.
    Device Quality: Photos are taken using the latest mobile devices to ensure high resolution and clarity.

    Metadata

    Each facial expression image set is accompanied by detailed metadata for each participant, including:

    Participant Identifier
    File Name
    Age
    Gender
    Country
    Expression
    Demographic Information
    File Format

    This metadata is essential for training models that can accurately recognize and identify expressions across different demographics and conditions.
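
    As a hedged illustration of how this metadata might be used, the snippet below filters a hypothetical `metadata.csv` by expression and demographics with pandas; the file name and column names mirror the fields listed above but are assumptions about the delivered layout.

    ```python
    # Sketch: slice the (hypothetical) participant metadata by expression and age.
    import pandas as pd

    meta = pd.read_csv("metadata.csv")  # assumed file name and CSV layout
    happy = meta[(meta["Expression"] == "Happy") & (meta["Age"].between(18, 70))]
    print(happy[["Participant Identifier", "File Name", "Gender", "Country"]].head())
    ```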

    Usage and Applications

    This facial emotion dataset is ideal for various applications in the field of computer vision, including but not limited to:

    Expression Recognition Models: Improving the accuracy and reliability of facial expression recognition systems.
    KYC Models: Streamlining the identity verification processes for financial and other services.
    Biometric Identity Systems: Developing robust biometric identification solutions.
    Generative AI Models: Training generative AI models to create realistic and diverse synthetic facial images.

    Secure and Ethical Collection

    Data Security: Data was securely stored and processed within our platform, ensuring data security and confidentiality.
    Ethical Guidelines: The biometric data collection process adhered to strict ethical guidelines, ensuring the privacy and consent of all participants.
    Participant Consent: All participants were informed of the purpose of collection and potential use of the data, as agreed through written consent.

    Updates and Customization

    We understand the evolving nature of AI and machine learning requirements. Therefore, we continuously add more assets with diverse conditions to this off-the-shelf facial expression dataset.

    Customization & Custom

  5. Pretty crowds are happy crowds - The influence of attractiveness on mood...

    • b2find.dkrz.de
    Cite
    (2025). Pretty crowds are happy crowds - The influence of attractiveness on mood perception. Psychological Research [Dataset]. https://b2find.dkrz.de/dataset/091d4ccd-351a-5716-9ff1-29b97b0f443a
    Dataset updated
    Feb 5, 2025
    Description

    Empirical findings predominantly support a happiness superiority effect in visual search and emotion categorization paradigms and reveal that social cues, like sex and race, moderate this advantage. A more recent study showed that the facial attribute attractiveness also influences the accuracy and speed of emotion perception. In the current study, we investigated whether the influence of attractiveness on emotion perception translates into a more general evaluation of moods when more than one emotional target is presented. In two experiments, we used the mood-of-the-crowd (MoC) task to investigate whether attractive crowds are perceived more positively compared to less attractive crowds. The task was to decide whether an array of faces included more angry or more happy faces. Furthermore, we recorded gaze movements to test the assumption that fixations on happy expressions occur more often in attractive crowds. Thirty-four participants took part in experiment 1 as well as in experiment 2. In both experiments, crowds presenting attractive faces were judged as being happy more frequently whereas the reverse pattern was found for unattractive crowds of faces. Moreover, participants were faster and more accurate when evaluating attractive crowds containing more happy faces as well as when judging unattractive crowds composed of more angry expressions. Additionally, in experiment 1, there were more fixations on happy compared to angry expressions in attractive crowds. Overall, the present findings support the assumption that attractiveness moderates emotion perception.

  6. Data from: Functional decline in facial expression generation in older...

    • data.niaid.nih.gov
    Cite
    Takano, Ruriko (2022). Data from: Functional decline in facial expression generation in older women: a cross-sectional study using three-dimensional morphometry [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_4955856
    Dataset updated
    Jun 1, 2022
    Dataset provided by
    Yamanami, Haruna
    Edlira, Zere
    Tanikawa, Chihiro
    Takada, Kenji
    Takano, Ruriko
    Takata, Sadaki
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    Elderly people show a decline in the ability to decode facial expressions, but also experience age-related facial structure changes that may render their facial expressions harder to decode. However, to date there is no empirical evidence to support the latter mechanism. The objective of this study was to assess the effects of age on facial morphology at rest and during smiling, in younger (n = 100; age range, 18–32 years) and older (n = 30; age range, 55–65 years) Japanese women. Three-dimensional images of each subject's face at rest and during smiling were obtained and wire mesh fitting was performed on each image to quantify the facial surface morphology. The mean node coordinates in each facial posture were compared between the groups using t-tests. Further, the node coordinates of the fitted mesh were entered into a principal component analysis (PCA) and a multifactor analysis of variance (MANOVA) to examine the direct interactions of aging and facial postures on the 3D facial morphology. The results indicated that there were significant age-related 3D facial changes in facial expression generation and the transition from resting to smiling produced a smaller amount of soft tissue movement in the older group than in the younger group. Further, 185 surface configuration variables were extracted and the variables were used to create four discriminant functions: the age-group discrimination for each facial expression, and the facial expression discrimination for each age group. For facial expression discrimination, the older group showed 80% accuracy with 2 of 66 significant variables, whereas the younger group showed 99% accuracy with 15 of 144 significant variables. These results indicate that in both facial expressions, the facial morphology was distinctly different in the younger and older subjects, and that in the older group, the facial morphology during smiling could not be as easily discriminated from the morphology at rest as in the younger group. These results may help to explain one aspect of the communication dysfunction observed in older people.

  7. Research Data and Materials: Sad, Angry and Fearful Facial Expressions...

    • figshare.com
    Cite
    Rahmi Saylik (2025). Research Data and Materials: Sad, Angry and Fearful Facial Expressions Interfere with Perception of Causal Outcomes [Dataset]. http://doi.org/10.6084/m9.figshare.28162817.v1
    Available download formats: pdf
    Dataset updated
    Jan 8, 2025
    Dataset provided by
    figshare (http://figshare.com/)
    Authors
    Rahmi Saylik
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    These files present the research data and materials (stimuli and forms) for the study entitled "Sad, Angry and Fearful Facial Expressions Interfere with Perception of Causal Outcomes".

  8. Bodily Emotion Recognition Dataset (BERD)

    • zenodo.org
    Cite
    Youngwug Cho; Myeongul Jung; Jungeun Bae; Kwanguk Kim (2025). Bodily Emotion Recognition Dataset (BERD) [Dataset]. http://doi.org/10.5281/zenodo.14625295
    Dataset updated
    Jan 10, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Youngwug Cho; Myeongul Jung; Jungeun Bae; Kwanguk Kim
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Abstract: The Behavioral Emotion Recognition Dataset (BERD) was developed as part of the research study titled "Behavioral Research Methodologies for Bodily Emotion Recognition." This dataset comprises motion capture data collected from participants performing emotional body movements under various experimental conditions. It is designed to facilitate the development and evaluation of automatic emotion recognition (AER) systems using bodily movement data. The dataset offers insights into the effects of participant acting expertise, motion capture device types, and emotional stimuli on bodily emotion recognition accuracy.

    Key Features:

    1. (Expertise) Participant Data:

    • Actors: Professional actors with at least three years of acting experience.
    • Non-actors: General participants with no formal acting training.
    • Test: general participants with no formal acting training, held out for testing.

    2. (Devices) Motion Capture Devices:

    • Marker-based motion capture (Optitrack system with 18 infrared cameras).
    • Pose estimation using RGB videos.
    • Kinect motion capture.
    • Mobile phone motion capture (iPhone 12 with ARKit).

    3. (Stimulus) Emotional Stimulus:

    • Word instructions (e.g., "happy," "sad").
    • Picture stimuli (Karolinska Directed Emotional Faces dataset).
    • Video stimuli (validated emotional film clips).

    4. Emotions Recorded:

    • Seven categories: happy, sad, surprised, angry, disgusted, fearful, and neutral.

    5. Data Format:

    • Skeletal data represented as 3D joint coordinates.
    • Sampling rate: 30 frames per second.
    • File format: CSV.
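
    A hedged sketch of reading one such motion file follows; the file name and the assumption that columns come in x/y/z triplets per joint are illustrative, not confirmed by the dataset documentation.

    ```python
    # Sketch: load 3D joint coordinates (30 fps CSV) and derive velocities.
    import numpy as np
    import pandas as pd

    frames = pd.read_csv("berd_sample.csv")  # hypothetical file name
    # Assumes columns are x/y/z triplets per joint -> shape (T, n_joints, 3).
    joints = frames.to_numpy().reshape(len(frames), -1, 3)
    velocity = np.diff(joints, axis=0) * 30.0  # units per second at 30 fps
    print(joints.shape, velocity.shape)
    ```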

    Potential Applications:

    • Developing deep learning models for bodily emotion recognition.
    • Studying the impact of data collection conditions on emotion recognition accuracy.

    Citation: If you use this dataset in your research, please cite it as follows:

  9. Data from: Colonization history and population genetics of the...

    • zenodo.org
    • data.niaid.nih.gov
    • +2 more
    Cite
    Peter J. P. Croucher; Geoffrey Stuart Oxford; Athena Lam; Neesha Mody; Rosemary G. Gillespie (2022). Data from: Colonization history and population genetics of the color-polymorphic Hawaiian happy-face spider Theridion grallator (Araneae, Theridiidae) [Dataset]. http://doi.org/10.5061/dryad.338tf52k
    Available download formats: txt
    Dataset updated
    May 30, 2022
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Peter J. P. Croucher; Geoffrey Stuart Oxford; Athena Lam; Neesha Mody; Rosemary G. Gillespie
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    Past geological and climatological processes shape extant biodiversity. In the Hawaiian Islands these processes have provided the physical environment for a number of extensive adaptive radiations. Yet single species that occur throughout the islands provide some of the best cases for understanding how species respond to the shifting dynamics of the islands in the context of colonization history and associated demographic and adaptive shifts. Here we focus on the Hawaiian happy-face spider, a single color-polymorphic species, and use mitochondrial and nuclear allozyme markers to examine 1) how the mosaic formation of the landscape has dictated population structure, and 2) how cycles of expansion and contraction of the habitat matrix have been associated with demographic shifts, including a 'quantum shift' in the genetic basis of the color polymorphism. The results show a marked structure among populations consistent with the age progression of the islands. The finding of low genetic diversity on the youngest site coupled with the very high diversity of haplotypes on the slightly older substrates that are highly dissected by recent volcanism suggest that the mosaic structure of the landscape may play an important role in allowing differentiation of the adaptive color-polymorphism.

  10. faces happy sad

    • kaggle.com
    Cite
    Duaa Zehra Alvi Malik Fakhar Abbas Alvi (2022). faces happy sad [Dataset]. https://www.kaggle.com/duaazehraalvi/faces-happy-sad
    Available download formats: zip (9541710 bytes)
    Dataset updated
    Jan 14, 2022
    Authors
    Duaa Zehra Alvi Malik Fakhar Abbas Alvi
    Description

    Dataset

    This dataset was created by Duaa Zehra Alvi Malik Fakhar Abbas Alvi


  11. Data from: reactiongif

    • huggingface.co
    • opendatalab.com
    Cite
    Julien Chaumond (2024). reactiongif [Dataset]. https://huggingface.co/datasets/julien-c/reactiongif
    Available download formats: Croissant (a format for machine-learning datasets; see mlcommons.org/croissant)
    Dataset updated
    Apr 10, 2024
    Authors
    Julien Chaumond
    License

    https://choosealicense.com/licenses/unknown/

    Description

    ReactionGIF

    From https://github.com/bshmueli/ReactionGIF

      Excerpt from original repo readme
    

    ReactionGIF is a unique, first-of-its-kind dataset of 30K sarcastic tweets and their GIF reactions. To find out more about ReactionGIF, check out our ACL 2021 paper:

    Shmueli, Ray and Ku, Happy Dance, Slow Clap: Using Reaction GIFs to Predict Induced Affect on Twitter

      Citation
    

    If you use our dataset, kindly cite the paper using the following BibTex… See the full description on the dataset page: https://huggingface.co/datasets/julien-c/reactiongif.
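
    Since the dataset is hosted on the Hugging Face Hub, a minimal sketch of loading it with the `datasets` library (`pip install datasets`) might look like this; the split and column names are whatever the hosted dataset defines, so they are printed rather than assumed.

    ```python
    # Sketch: load the reactiongif dataset from the Hugging Face Hub.
    from datasets import load_dataset

    ds = load_dataset("julien-c/reactiongif")
    print(ds)                # shows the available splits and features
    first_split = next(iter(ds.values()))
    print(first_split[0])    # first example of the first split
    ```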

  12. Data from: A smile hampers encoding and memory for non-happy eyes in a face:...

    • tandf.figshare.com
    Cite
    Aida Gutiérrez-García; Mario Del Líbano; Andrés Fernández-Martín; Manuel G. Calvo (2025). A smile hampers encoding and memory for non-happy eyes in a face: temporal dynamics and importance of initial fixation [Dataset]. http://doi.org/10.6084/m9.figshare.28617028.v1
    Available download formats: xlsx
    Dataset updated
    Mar 18, 2025
    Dataset provided by
    Taylor & Francis
    Authors
    Aida Gutiérrez-García; Mario Del Líbano; Andrés Fernández-Martín; Manuel G. Calvo
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Blended facial expressions with a smiling mouth but non-happy eyes (neutral, sad, etc.) are often (incorrectly) judged as “happy”. We investigated the time course of this phenomenon, both forward and backward. To do this, we varied the order of presentation of a prime stimulus (upper half of a face) and a probe (lower half of a face) stimulus, and their display durations. The forward and the backward influence of the smile was assessed when the mouth was seen before or after the eyes. Participants categorised the eye expression when the mouth and the eyes were congruent or incongruent. Results showed that, as a forward prime, a smiling mouth biased the recognition of incongruent (non-happy) eyes as if they were happy. The effect started as early as 100 ms and dissipated by 1000 ms. As a backward prime, the smile also biased recognition of non-happy eye expressions as happy for at least the first 300 ms. These results suggest, respectively, that the presence of a smiling mouth impairs the accurate encoding and memory for non-happy eyes. Angry eyes are the least susceptible to this effect, probably due to their distinctiveness. An alternative response (rather than sensitivity) bias was partially ruled out.

  13. Smile Detection Dataset

    • universe.roboflow.com
    Cite
    Cris (2024). Smile Detection Dataset [Dataset]. https://universe.roboflow.com/cris-kov4g/smile-detection-a6ac9/model/2
    Available download formats: zip
    Dataset updated
    May 6, 2024
    Dataset authored and provided by
    Cris
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Variables measured
    Smiling Face
    Description

    Smile Detection

    ## Overview
    
    Smile Detection is a dataset for classification tasks - it contains Smiling Face annotations for 1,203 images.
    
    ## Getting Started
    
    You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
    
      ## License
    
      This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
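
    As with the Roboflow dataset in entry 1, here is a hedged sketch of pulling this classification project with the `roboflow` package; the API key is a placeholder and the "folder" (class-per-directory) export format is one assumed option.

    ```python
    # Sketch: download the Smile Detection classification dataset from Roboflow.
    from roboflow import Roboflow

    rf = Roboflow(api_key="YOUR_API_KEY")  # placeholder API key
    project = rf.workspace("cris-kov4g").project("smile-detection-a6ac9")
    dataset = project.version(2).download("folder")  # assumed export format
    print(dataset.location)  # local folder with one subdirectory per class
    ```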
    
  14. Age differences in emotion perception in a multiple target setting: An...

    • b2find.dkrz.de
    Cite
    (2023). Age differences in emotion perception in a multiple target setting: An eye-tracking study [Dataset]. https://b2find.dkrz.de/dataset/9af84ecd-9f18-5011-927d-58c7a9fcf6b0
    Dataset updated
    Oct 29, 2023
    Description

    Research focusing on the association between age and emotion perception has revealed inconsistent findings, with some support for an age-related positivity effect, as predicted by socioemotional selectivity theory. We used the mood-of-the-crowd (MoC) task to investigate whether older adults judge a crowd consisting of happy and angry expressions to be dominated by happy faces more frequently. The task was to decide whether an array of faces included more angry or more happy faces. Accuracy, reaction times, and gaze movements were analyzed to test the hypothesis, derived from socioemotional selectivity theory, that age would be associated with a bias toward judging crowds as happy, and with longer and more numerous fixations on happy expressions. Seventy-six participants took part in the study representing three different age groups (young, middle-aged, old). Contrary to the hypothesis, older participants more often judged the emotional crowd to be angry compared to younger participants. Furthermore, whereas fixations were longer for happy faces than for angry faces in younger adults, this difference was not present in older adults. A decline in inhibitory processing in older adults as well as higher cognitive demands of the task are discussed as possible explanations for these findings.

  15. smile-detection

    • kaggle.com
    Cite
    Ghousethanedar (2020). smile-detection [Dataset]. https://www.kaggle.com/datasets/ghousethanedar/smiledetection
    Available download formats: zip (10788287 bytes)
    Dataset updated
    Aug 31, 2020
    Authors
    Ghousethanedar
    Description

    Dataset

    This dataset was created by Ghousethanedar


  16. Data from: A Comparison of the Affectiva iMotions Facial Expression Analysis...

    • omicsdi.org
    Cite
    (2023). A Comparison of the Affectiva iMotions Facial Expression Analysis Software With EMG for Identifying Facial Expressions of Emotion. [Dataset]. https://www.omicsdi.org/dataset/biostudies/S-EPMC7058682
    Dataset updated
    Jul 13, 2023
    Variables measured
    Unknown
    Description

    Human faces express emotions, informing others about their affective states. To measure expressions of emotion, facial electromyography (EMG) has been widely used, requiring electrodes and technical equipment. More recently, emotion recognition software has been developed that detects emotions from video recordings of human faces. However, its validity and comparability to EMG measures is unclear. The aim of the current study was to compare the Affectiva Affdex emotion recognition software by iMotions with EMG measurements of the zygomaticus major and corrugator supercilii muscles, concerning its ability to identify happy, angry and neutral faces. Twenty participants imitated these facial expressions while videos and EMG were recorded. Happy and angry expressions were detected by both the software and by EMG above chance, while neutral expressions were more often falsely identified as negative by EMG compared to the software. Overall, EMG and software values correlated highly. In conclusion, Affectiva Affdex software can identify facial expressions and its results are comparable to EMG findings.

  17. Age differences in emotion perception in a multiple target setting: An...

    • heidata.uni-heidelberg.de
    Cite
    Alica Bucher; Andreas Voss; Julia Spaniol; Amelie Hische; Nicole Sauer (2020). Age differences in emotion perception in a multiple target setting: An eye-tracking study [Dataset]. http://doi.org/10.11588/DATA/4X2FXC
    Available download formats: txt (1455 bytes), zip (271454592 bytes)
    Dataset updated
    Feb 26, 2020
    Dataset provided by
    heiDATA
    Authors
    Alica Bucher; Andreas Voss; Julia Spaniol; Amelie Hische; Nicole Sauer
    License

    https://heidata.uni-heidelberg.de/api/datasets/:persistentId/versions/2.0/customlicense?persistentId=doi:10.11588/DATA/4X2FXC

    Description

    Research focusing on the association between age and emotion perception has revealed inconsistent findings, with some support for an age-related positivity effect, as predicted by socioemotional selectivity theory. We used the mood-of-the-crowd (MoC) task to investigate whether older adults judge a crowd consisting of happy and angry expressions to be dominated by happy faces more frequently. The task was to decide whether an array of faces included more angry or more happy faces. Accuracy, reaction times, and gaze movements were analyzed to test the hypothesis, derived from socioemotional selectivity theory, that age would be associated with a bias toward judging crowds as happy, and with longer and more numerous fixations on happy expressions. Seventy-six participants took part in the study representing three different age groups (young, middle-aged, old). Contrary to the hypothesis, older participants more often judged the emotional crowd to be angry compared to younger participants. Furthermore, whereas fixations were longer for happy faces than for angry faces in younger adults, this difference was not present in older adults. A decline in inhibitory processing in older adults as well as higher cognitive demands of the task are discussed as possible explanations for these findings.

  18. DataSheet_1_Elevated accuracy in recognition of subliminal happy facial...

    • frontiersin.figshare.com
    Cite
    Zirong Qian; Yunbo Yang; Katharina Domschke; Alexander L. Gerlach; Alfons Hamm; Jan Richter; Martin J. Herrmann; Jürgen Deckert; Volker Arolt; Peter Zwanzger; Martin Lotze; Bettina Pfleiderer; Hans-Ulrich Wittchen; Thomas Lang; Andreas Ströhle; Carsten Konrad; Winfried Rief; Thomas Suslow; Andreas Jansen; Tilo Kircher; Benjamin Straube (2024). DataSheet_1_Elevated accuracy in recognition of subliminal happy facial expressions in patients with panic disorder after psychotherapy.pdf [Dataset]. http://doi.org/10.3389/fpsyt.2024.1375751.s001
    Available download formats: pdf
    Dataset updated
    Jun 13, 2024
    Dataset provided by
    Frontiers
    Authors
    Zirong Qian; Yunbo Yang; Katharina Domschke; Alexander L. Gerlach; Alfons Hamm; Jan Richter; Martin J. Herrmann; Jürgen Deckert; Volker Arolt; Peter Zwanzger; Martin Lotze; Bettina Pfleiderer; Hans-Ulrich Wittchen; Thomas Lang; Andreas Ströhle; Carsten Konrad; Winfried Rief; Thomas Suslow; Andreas Jansen; Tilo Kircher; Benjamin Straube
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Background: Individuals with anxiety disorders (ADs) often display hypervigilance to threat information, although this response may be less pronounced following psychotherapy. This study aims to investigate the unconscious recognition performance of facial expressions in patients with panic disorder (PD) post-treatment, shedding light on alterations in their emotional processing biases.

    Methods: Patients with PD (n=34) after (exposure-based) cognitive behavior therapy and healthy controls (n=43) performed a subliminal affective recognition task. Emotional facial expressions (fearful, happy, or mirrored) were displayed for 33 ms and backwardly masked by a neutral face. Participants completed a forced choice task to discriminate the briefly presented facial stimulus and an uncovered condition where only the neutral mask was shown. We conducted a secondary analysis to compare groups based on their four possible response types under the four stimulus conditions and examined the correlation of the false alarm rate for fear responses to non-fearful (happy, mirrored, and uncovered) stimuli with clinical anxiety symptoms.

    Results: The patient group showed a unique selection pattern in response to happy expressions, with significantly more correct "happy" responses compared to controls. Additionally, lower severity of anxiety symptoms after psychotherapy was associated with a decreased false fear response rate with non-threat presentations.

    Conclusion: These data suggest that patients with PD exhibited a "happy-face recognition advantage" after psychotherapy. Fewer symptoms after treatment were related to a reduced fear bias. Thus, a differential facial emotion detection task could be a suitable tool to monitor response patterns and biases in individuals with ADs in the context of psychotherapy.

  19. Table_2_East Asian Young and Older Adult Perceptions of Emotional Faces From...

    • figshare.com
    Cite
    Yu-Zhen Tu; Dong-Wei Lin; Atsunobu Suzuki; Joshua Oon Soo Goh (2023). Table_2_East Asian Young and Older Adult Perceptions of Emotional Faces From an Age- and Sex-Fair East Asian Facial Expression Database.XLSX [Dataset]. http://doi.org/10.3389/fpsyg.2018.02358.s004
    Available download formats: xlsx
    Dataset updated
    May 30, 2023
    Dataset provided by
    Frontiers
    Authors
    Yu-Zhen Tu; Dong-Wei Lin; Atsunobu Suzuki; Joshua Oon Soo Goh
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    East Asia
    Description

    There is increasing interest in clarifying how different face emotion expressions are perceived by people from different cultures, of different ages and sex. However, the scant availability of well-controlled emotional face stimuli from non-Western populations limits the evaluation of cultural differences in face emotion perception and how this might be modulated by age and sex differences. We present a database of East Asian face expression stimuli, enacted by young and older, male and female Taiwanese participants using the Facial Action Coding System (FACS). Combined with a prior database, this present database consists of 90 identities with happy, sad, angry, fearful, disgusted, surprised and neutral expressions amounting to 628 photographs. Twenty young and 24 older East Asian raters scored the photographs for intensities of multiple dimensions of emotions and induced affect. Multivariate analyses characterized the dimensionality of perceived emotions and quantified effects of age and sex. We also applied commercial software to extract computer-based metrics of emotions in photographs. Taiwanese raters perceived happy faces as one category; sad, angry, and disgusted expressions as one category; and fearful and surprised expressions as one category. Younger females were more sensitive to face emotions than younger males. Whereas older males showed reduced face emotion sensitivity, older females' sensitivity was similar to or accentuated relative to that of young females. Commercial software dissociated six emotions according to the FACS, demonstrating that defining visual features were present. Our findings show that East Asians perceive a different dimensionality of emotions than Western-based definitions in face recognition software, regardless of age and sex. Critically, stimuli with detailed cultural norms are indispensable in interpreting neural and behavioral responses involving human facial expression processing. To this end, we add to the tools, which are available upon request, for conducting such research.

  20. Table_1_How Do Induced Affective States Bias Emotional Contagion to Faces? A...

    • figshare.com
    Cite
    Andrés Pinilla; Ricardo M. Tamayo; Jorge Neira (2020). Table_1_How Do Induced Affective States Bias Emotional Contagion to Faces? A Three-Dimensional Model.docx [Dataset]. http://doi.org/10.3389/fpsyg.2020.00097.s001
    Available download formats: docx
    Dataset updated
    Jan 31, 2020
    Dataset provided by
    Frontiers
    Authors
    Andrés Pinilla; Ricardo M. Tamayo; Jorge Neira
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Affective states can propagate in a group of people and influence their ability to judge others' affective states. In the present paper, we present a simple mathematical model to describe this process in a three-dimensional affective space. We obtained data from 67 participants randomly assigned to two experimental groups. Participants watched either an upsetting or uplifting video previously calibrated for this goal. Immediately afterwards, participants reported their baseline subjective affect in three dimensions: (1) positivity, (2) negativity, and (3) arousal. In a second phase, participants rated the affect they subjectively judged from 10 target angry faces and 10 target happy faces on the same three-dimensional scales. These judgments were used as an index of participants' affective state after observing the faces. Participants' affective responses were subsequently mapped onto a simple three-dimensional model of emotional contagion, in which the shortest distance between the baseline self-reported affect and the target judgment was calculated. The results display a double dissociation: negatively induced participants show more emotional contagion to angry than happy faces, while positively induced participants show more emotional contagion to happy than angry faces. In sum, the emotional contagion exerted by the videos selectively affected judgments of the affective state of others' faces. We discuss the directionality of emotional contagion to faces, considering whether negative emotions are more easily propagated than positive ones. Additionally, we comment on the lack of significant correlations between our model and standardized tests of empathy and emotional contagion.
