The Jester Gesture Recognition dataset includes 148,092 labeled video clips of humans performing basic, pre-defined hand gestures in front of a laptop camera or webcam. It is designed for training machine learning models to recognize hand gestures such as sliding two fingers down, swiping left or right, and drumming fingers.
This is a subset of the full JESTER dataset, a large collection of densely labeled video clips showing humans performing pre-defined hand gestures in front of a laptop camera or webcam. The dataset was created by a large number of crowd workers and supports training robust machine learning models for hand gesture recognition. The complete dataset is available from its original source.
MIT License: https://opensource.org/licenses/MIT
License information was derived automatically
Jester Event-Based Gesture Dataset (Converted Subset)
Description
This dataset is a custom event-based version of a subset of the original JESTER hand gesture dataset. The JESTER dataset is a large-scale, densely labeled video dataset depicting humans performing predefined hand gestures in front of webcams or laptop cameras. It was originally created by crowd workers and is widely used for training robust machine learning models for gesture recognition.
Conversion… See the full description on the dataset page: https://huggingface.co/datasets/Ishara5/20bn-jester-event.
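For quick experimentation, the subset should be loadable with the Hugging Face datasets library using the repository ID from the link above. The sketch below is a minimal example; the split name "train" and the record schema are assumptions, since the dataset card may define different configurations.

```python
# Minimal sketch: loading the converted subset with the Hugging Face
# `datasets` library. The repository ID comes from the dataset page above;
# the "train" split and the field layout are assumptions, not confirmed
# by the dataset card.
from datasets import load_dataset

dataset = load_dataset("Ishara5/20bn-jester-event", split="train")

# Inspect one record to discover the actual schema before relying on
# specific field names.
example = dataset[0]
print(example.keys())
```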