Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The continued effort to study gait kinematics and the increased interest in identifying individuals based on their gait patterns could be strengthened by the inclusion of data from older groups. To address this need and complement our previous database on healthy young adults, we present an addition to the Nonlinear Analysis Core (NONAN) GaitPrint database. We offer full-body inertial measurement data from 41 older adults (56+ years old; 20 men and 21 women; age: 64.7 ± 7.5 years; height: 1.7 ± 0.1 m; body mass: 81.1 ± 17.8 kg) during self-paced overground walking on a 200 m indoor track, across 18 four-minute trials conducted over two days. The recordings are supported by a range of pre-calculated spatiotemporal variables, a list of each subject's anthropometrics, notes for each walking trial, and template scripts for easier application of our data to classroom assignments or laboratory research. In addition, a preliminary Bayesian analysis found a range of evidence supporting age-related gait changes between this database and our database on young adults.
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The PHYTMO database contains data from physical therapy exercises and gait variations recorded with magneto-inertial sensors, together with information from an optical reference system. PHYTMO includes recordings of 30 volunteers aged between 20 and 70 years. A total of six exercises and three gait variations commonly prescribed in physical therapy were recorded. The volunteers performed two series of at least eight repetitions each. Four magneto-inertial sensors were placed on the lower or upper limbs to record the motions, together with passive optical reflectors. The files include the specifications of the inertial sensors and the cameras. The database includes magneto-inertial data (linear acceleration, turn rate, and magnetic field), together with highly accurate location and orientation in 3D space provided by the optical system (errors below 1 mm). The database files are stored in CSV format to ensure usability with common data processing software. The main aim of this dataset is to make inertial data available for two purposes: the analysis of techniques for identifying and evaluating exercises monitored with wearable inertial sensors, and the validation of inertial sensor-based algorithms for human motion monitoring that estimate segment orientation in 3D space. Furthermore, the database stores enough data to train and evaluate machine-learning-based algorithms. The age range of the participants can be useful for establishing age-based metrics for exercise evaluation or for studying differences in motion between age groups. Finally, the MATLAB function features_extraction, developed by the authors, is also provided. This function splits signals using a sliding window, returning the resulting segments, and extracts time- and frequency-domain signal features based on prior studies in the literature.
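The authors' features_extraction ships with the dataset in MATLAB; as a rough Python sketch of the same sliding-window idea (window length, overlap, and feature set here are illustrative assumptions, not the authors' exact choices):

```python
import numpy as np

def sliding_window_features(signal, fs, win_s=2.0, overlap=0.5):
    """Segment a 1-D signal with an overlapping sliding window and
    compute simple time- and frequency-domain features per window."""
    win = int(win_s * fs)
    step = max(1, int(win * (1.0 - overlap)))
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        seg = np.asarray(signal[start:start + win], dtype=float)
        spectrum = np.abs(np.fft.rfft(seg))
        freqs = np.fft.rfftfreq(win, d=1.0 / fs)
        feats.append({
            "mean": seg.mean(),
            "std": seg.std(),
            "rms": np.sqrt(np.mean(seg ** 2)),
            # Dominant frequency, ignoring the DC component.
            "dominant_freq": freqs[1:][spectrum[1:].argmax()],
        })
    return feats

# Example: 10 s of a synthetic 1 Hz oscillation sampled at 100 Hz.
t = np.arange(0, 10, 0.01)
print(sliding_window_features(np.sin(2 * np.pi * t), fs=100)[0])
```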
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The GSTRIDE database contains relevant metrics and motion data of older people for the assessment of their health status. The data correspond to 163 patients, 45 men and 118 women, between 70 and 98 years old, with an average Body Mass Index (BMI) of 26.1 ± 5.0 kg/m2 and a cognitive deterioration status between 1 and 7 according to the Global Deterioration Scale (GDS). This ensures variability among the volunteers in terms of socio-demographic and anatomic parameters and their functional and cognitive capacities. The database files are stored in TXT and CSV format to ease their use with common data processing software.
We provide socio-demographic data; anatomical, functional, and cognitive variables; and the outcome measurements from tests commonly performed for the evaluation of older people. The evaluation tests carried out to obtain these data are the 4-metre Gait Speed Test, Hand Grip Strength, the Short Physical Performance Battery (SPPB), the Timed Up and Go (TUG), and the Short Falls Efficacy Scale International (FES-I). We also include the outcomes of the GDS questionnaire, the frailty assessment, and information about falls during the year prior to the tests.
These data are complemented with the gait parameters of a walking test recorded by an Inertial Measurement Unit (IMU) placed on the foot. The walking tests have an average duration of 21.4 ± 7.1 minutes and are analyzed to estimate the total walking distance, the number of strides, and the spatio-temporal gait parameters. The results of this analysis include the following metrics: stride duration, stride length, step speed, percentage of the gait phases (toe-off, swing, heel strike, foot flat) over the strides, foot angle during the toe-off and heel-strike phases, cadence, 3D and 2D paths, and clearance. We provide these metrics for each detected stride, as well as their average and variance values in the database record.
The raw signals from the IMUs and the signals calibrated with the calibration parameters (bias vector, misalignment and scaling matrix, and sampling-rate correction factor) are included in the database so that researchers can pursue other approaches to gait analysis. These signals consist of the linear acceleration and the turn rate. The files also contain the calibration parameters and the specifications of the inertial sensors used in this work. Furthermore, these data are accompanied by the gait analysis code used to obtain the metrics given in the database, which also provides visualization tools to study the distribution of these metrics.
GSTRIDE is especially focused on, but not limited to, the study of faller and non-faller older people. The main aim of this dataset is to enable the study of these two populations. By including the results of the health evaluation tests and questionnaires together with the inertial and spatio-temporal data, researchers can analyze different techniques for the identification of fallers. Moreover, this database allows the research community to analyze cognitive deterioration and frailty parameters of patients.
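As a minimal sketch of how the per-stride metrics and their summary statistics might be consumed (the file and column names below are hypothetical; the actual headers are documented in the database files):

```python
import pandas as pd

# Hypothetical file and column names for one patient's stride table;
# consult the database files for the real naming scheme.
strides = pd.read_csv("GSTRIDE_patient_001_strides.csv")

# Average and variance of a few spatio-temporal parameters,
# mirroring the summary values shipped with the database record.
print(strides[["stride_duration", "stride_length", "step_speed"]]
      .agg(["mean", "var"]))
```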
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Purpose: An Inertial Measurement Unit (IMU) system was developed for tracking wheelchair sports kinematics; it is installed on the left side of the wheel axle. The researchers developed the data processing system and designed a Graphical User Interface (GUI) that displays the kinematic data of the wheelchair. It allows trainers to track wheelchairs and gather data for analyzing and developing wheelchair athletes' skills and fitness, and it can be used in both competitive and training sessions. This research aims to validate the developed IMU system for tracking wheelchair sports kinematics against a 2D motion analysis (2DMA).
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Database Covering the Previously Excluded Daily Life Activities
In biomedical engineering, implants are designed according to the boundary conditions of gait data and tested against them. However, owing to diversity in cultural backgrounds, religious rituals may involve different ranges of motion and different loading patterns. Especially in the Eastern part of the world, Activities of Daily Living (ADL) include salat, yoga rituals, and various sitting postures. Although databases cover ADL for Western populations, a database covering these diverse activities of the Eastern world, specific to these populations, has been non-existent. Including previously excluded ADL is a key step in understanding the kinematics and kinetics of these activities. Thanks to developments in motion capture technologies, the excluded ADL data are captured to obtain coordinate values from which the range of motion and the joint reaction forces are calculated. This study focuses on the data collection protocol and the creation of an online database of previously excluded ADL, targeting 200 healthy subjects from West and Middle East Asian populations via Qualisys and IMU motion capture systems and force plates. Anthropometrics, which are known to affect kinematics and kinetics, are also included in the collected data. The current version of the database covers 50 volunteers performing 12 different activities; the final target is 100 male and 100 female healthy volunteers, with C3D and BVH file types included. The tasks are defined and listed in a table so that the database can be queried by age, gender, BMI, type of activity, and motion capture system. The data are collected only from a healthy population in order to understand healthy motion patterns during these previously excluded ADL. The collected data are intended for designing implants that allow these activities to be performed without compromising the quality of life of patients who perform them.
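A sketch of the kind of query the task table is meant to support, assuming a hypothetical index file and column names (the real schema may differ):

```python
import pandas as pd

# Hypothetical index file with one row per recording; column names
# are illustrative, not the database's actual schema.
index = pd.read_csv("adl_index.csv")

# Example query: salat recordings from female volunteers aged 20-40
# captured with the Qualisys system.
hits = index[(index["activity"] == "salat")
             & (index["gender"] == "F")
             & (index["age"].between(20, 40))
             & (index["capture_system"] == "Qualisys")]
print(hits[["subject_id", "bmi", "c3d_file"]])
```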
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Chronological overview of the included studies on biomechanical analysis of different tennis groundstroke types, directions, and stance styles.
CC0 1.0 Universal Public Domain Dedicationhttps://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
The database presented here provides recordings of everyday walk scenarios in a natural urban environment, including synchronized IMU-, FSR-, and gaze data. Twenty healthy participants (five females, fifteen males, between 18 and 69 years old, 178.5 ± 7.64 cm, 72.9 ± 8.7 kg) wore a full-body Lycra suit with 17 IMU sensors, insoles with eight pressure sensing cells per foot, and a mobile eye tracker. They completed three different walk courses, where each trial consisted of several minutes of walking, including a variety of common elements such as ramps, stairs, and pavements. The data is annotated in detail to enable machine-learning-based analysis and prediction.
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Chronological overview of the included studies on biomechanical analysis of different tennis serve types and stance styles.
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Instructions

Acquisition Protocol

The 8th Ninapro database is described in the paper: "Agamemnon Krasoulis, Sethu Vijayakumar & Kianoush Nazarpour. Effect of user adaptation on prosthetic finger control with an intuitive myoelectric decoder. Frontiers in Neuroscience." Please cite this paper for any work related to this database.

More information about the protocol can be found in the original paper: "Manfredo Atzori, Arjan Gijsberts, Claudio Castellini, Barbara Caputo, Anne-Gabrielle Mittaz Hager, Simone Elsig, Giorgio Giatsidis, Franco Bassetto & Henning Müller. Electromyography data for non-invasive naturally-controlled robotic hand prostheses. Scientific Data, 2014" (http://www.nature.com/articles/sdata201453).

The experiment comprised nine movements, including single-finger as well as functional movements. The subjects repeated the instructed movements following visual cues (i.e. movies) shown on the screen of a computer monitor.

Muscular activity was recorded using 16 active double-differential wireless sensors from a Delsys Trigno IM Wireless EMG system. The sensors comprise EMG electrodes and 9-axis inertial measurement units (IMUs). They were positioned in two rows of eight units around the participants' right forearm at the level of the radiohumeral joint. No specific muscles were targeted. The sensors were fixed on the forearm using the standard manufacturer-provided adhesive bands, and a hypoallergenic, latex-free elastic band was placed around the sensors to keep them fixed during the acquisition. The sEMG signals were sampled at 1111 Hz, accelerometer and gyroscope data at 148 Hz, and magnetometer data at 74 Hz. All signals were upsampled to 2 kHz and post-synchronized.

Hand kinematic data were recorded with a dataglove (Cyberglove 2, 18-DOF model). For all participants (both able-bodied and amputee), the data glove was worn on the left hand, i.e. contralateral to the arm where the EMG sensors were located ("n/a" corresponds to Cyberglove sensors that were not available, since an 18-DOF model was used). Prior to each experimental session, the data glove was calibrated for the specific participant using the "quick calibration" procedure provided by the manufacturer. The Cyberglove signals were sampled at 100 Hz and subsequently upsampled to 2 kHz and synchronized to the EMG and IMU data.

Ten able-bodied (Subjects 1-10) and two right-hand transradial amputee participants (Subjects 11-12) are included in the dataset. During the acquisition, the subjects were asked to repeat 9 movements using both hands (bilateral mirrored movements). The duration of each movement varied between 6 and 9 seconds, and consecutive trials were interleaved with 3 seconds of rest. Each repetition started with the participant holding their fingers at the rest state, involved slowly reaching the target posture shown on the screen, and returning to the rest state before the end of the trial. The following movements were included:

0. rest
1. thumb flexion/extension
2. thumb abduction/adduction
3. index finger flexion/extension
4. middle finger flexion/extension
5. combined ring and little fingers flexion/extension
6. index pointer
7. cylindrical grip
8. lateral grip
9. tripod grip

Datasets

For each participant, three datasets were collected: the first two (acquisitions 1 & 2) comprised 10 repetitions of each movement, and the third (acquisition 3) comprised only two repetitions. For each subject, the associated .zip file contains three MATLAB files in .mat format, one for each dataset, with synchronized variables.

The variables included in the .mat files are the following:

· subject: subject number
· exercise: exercise number (value set to 1 in all data files)
· emg (16 columns): sEMG signals from the 16 sensors
· acc (48 columns): three-axis accelerometer data from the 16 sensors
· gyro (48 columns): three-axis gyroscope data from the 16 sensors
· mag (48 columns): three-axis magnetometer data from the 16 sensors
· glove (18 columns): calibrated signals from the 18 sensors of the Cyberglove
· stimulus (1 column): the movement repeated by the subject
· restimulus (1 column): the movement repeated by the subject, with the duration of the movement label refined a posteriori to correspond to the real movement
· repetition (1 column): repetition number of the stimulus
· rerepetition (1 column): repetition number of restimulus

Important notes

Given the nature of the data collection procedure (slow finger movement and lack of an extended hold period), this database is intended to be used for the estimation/reconstruction of finger movement rather than motion/grip classification. In other words, its purpose is to provide a benchmark for decoding finger position from (contralateral) EMG measurements using regression algorithms, as opposed to classification. Therefore, the use of the stimulus/restimulus vectors as target variables should be avoided; they are only provided to give access to the exact timings of each movement repetition.

Three datasets/acquisitions are provided for each subject. It is recommended that dataset 3, which comprises only two repetitions of each movement, be used only to report performance results, with no training or hyper-parameter tuning performed on it (i.e. a test dataset). The three datasets, which were recorded sequentially, offer an out-of-the-box three-way split for model training (dataset 1), hyper-parameter tuning/validation (dataset 2), and performance testing (dataset 3). Another possibility is to merge datasets 1 & 2, perform training and validation/hyper-parameter tuning using K-fold cross-validation, and then report performance results on dataset 3.
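A minimal loading-and-splitting sketch following the recommended three-way protocol, using scipy to read the .mat files (the file names are illustrative; use the names inside each subject's .zip):

```python
from scipy.io import loadmat

# Acquisitions 1-3 for one subject; file names are illustrative.
train = loadmat("S1_A1.mat")  # dataset 1: training
valid = loadmat("S1_A2.mat")  # dataset 2: validation / hyper-parameter tuning
test = loadmat("S1_A3.mat")   # dataset 3: held-out test, reporting only

# Regression setup: decode the 18 glove channels from the 16 EMG channels.
# stimulus/restimulus are provided only for timing, not as targets.
X_train, y_train = train["emg"], train["glove"]
X_valid, y_valid = valid["emg"], valid["glove"]
X_test, y_test = test["emg"], test["glove"]
print(X_train.shape, y_train.shape)  # (n_samples, 16), (n_samples, 18)
```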
Open Database License (ODbL) v1.0https://www.opendatacommons.org/licenses/odbl/1.0/
License information was derived automatically
Imbalanced datasets for benchmarking
=======================
The different algorithms of the `imbalanced-learn` toolbox are evaluated on a set of common datasets, which are imbalanced to varying degrees. This benchmark was proposed in [1]. The following section presents its main characteristics.
Characteristics
-------------------
|ID |Name |Repository & Target |Ratio |# samples| # features |
|:---:|:----------------------:|--------------------------------------|:------:|:-------------:|:--------------:|
|1 |Ecoli |UCI, target: imU |8.6:1 |336 |7 |
|2 |Optical Digits |UCI, target: 8 |9.1:1 |5,620 |64 |
|3 |SatImage |UCI, target: 4 |9.3:1 |6,435 |36 |
|4 |Pen Digits |UCI, target: 5 |9.4:1 |10,992 |16 |
|5 |Abalone |UCI, target: 7 |9.7:1 |4,177 |8 |
|6 |Sick Euthyroid |UCI, target: sick euthyroid |9.8:1 |3,163 |25 |
|7 |Spectrometer |UCI, target: >=44 |11:1 |531 |93 |
|8 |Car_Eval_34 |UCI, target: good, v good |12:1 |1,728 |6 |
|9 |ISOLET |UCI, target: A, B |12:1 |7,797 |617 |
|10 |US Crime |UCI, target: >0.65 |12:1 |1,994 |122 |
|11 |Yeast_ML8 |LIBSVM, target: 8 |13:1 |2,417 |103 |
|12 |Scene |LIBSVM, target: >one label |13:1 |2,407 |294 |
|13 |Libras Move |UCI, target: 1 |14:1 |360 |90 |
|14 |Thyroid Sick |UCI, target: sick |15:1 |3,772 |28 |
|15 |Coil_2000 |KDD, CoIL, target: minority |16:1 |9,822 |85 |
|16 |Arrhythmia |UCI, target: 06 |17:1 |452 |279 |
|17 |Solar Flare M0 |UCI, target: M->0 |19:1 |1,389 |10 |
|18 |OIL |UCI, target: minority |22:1 |937 |49 |
|19 |Car_Eval_4 |UCI, target: vgood |26:1 |1,728 |6 |
|20 |Wine Quality |UCI, wine, target: <=4 |26:1 |4,898 |11 |
|21 |Letter Img |UCI, target: Z |26:1 |20,000 |16 |
|22 |Yeast_ME2 |UCI, target: ME2 |28:1 |1,484 |8 |
|23 |Webpage |LIBSVM, w7a, target: minority |33:1 |49,749 |300 |
|24 |Ozone Level |UCI, ozone, data |34:1 |2,536 |72 |
|25 |Mammography |UCI, target: minority |42:1 |11,183 |6 |
|26 |Protein homo. |KDD CUP 2004, minority |111:1|145,751 |74 |
|27 |Abalone_19 |UCI, target: 19 |130:1|4,177 |8 |
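These 27 sets can be pulled by name with imbalanced-learn's `fetch_datasets` helper; a minimal evaluation sketch (the Ecoli set and the SMOTE-plus-random-forest pipeline are arbitrary choices):

```python
from imblearn.datasets import fetch_datasets
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import make_pipeline
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Download dataset #1 (Ecoli, 8.6:1) by its registered name.
ecoli = fetch_datasets(filter_data=["ecoli"])["ecoli"]

# Resample only the training folds by putting SMOTE inside an imblearn pipeline.
pipe = make_pipeline(SMOTE(random_state=0),
                     RandomForestClassifier(random_state=0))
scores = cross_val_score(pipe, ecoli.data, ecoli.target,
                         scoring="roc_auc", cv=5)
print(f"ROC AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```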
References
----------
[1] Ding, Zejin. "Diversified Ensemble Classifiers for Highly Imbalanced Data Learning and their Application in Bioinformatics." Dissertation, Georgia State University, (2011).
[2] Blake, Catherine, and Christopher J. Merz. "UCI Repository of machine learning databases." (1998).
[3] Chang, Chih-Chung, and Chih-Jen Lin. "LIBSVM: a library for support vector machines." ACM Transactions on Intelligent Systems and Technology (TIST) 2.3 (2011): 27.
[4] Caruana, Rich, Thorsten Joachims, and Lars Backstrom. "KDD-Cup 2004: results and analysis." ACM SIGKDD Explorations Newsletter 6.2 (2004): 95-108.
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The objective of the UMAHand dataset is to provide a systematic, Internet-accessible benchmarking database for evaluating algorithms for the automatic identification of manual activities. The database was created by monitoring 29 predefined activities involving specific movements of the dominant hand. These activities were performed by 25 participants, each completing a certain number of repetitions. During each movement, participants wore a 'mote' or Shimmer sensor device on their dominant hand's wrist. This sensor, comparable in weight and volume to a wristwatch, was attached with an elastic band in a predetermined orientation.

The Shimmer device contains an Inertial Measurement Unit (IMU) with a triaxial accelerometer, gyroscope, magnetometer, and barometer. These sensors recorded acceleration, angular velocity, magnetic field, and atmospheric pressure at a constant sampling frequency of 100 Hz during each movement.

The UMAHand dataset comprises a main directory and three subdirectories: TRACES (containing the measurements), VIDEOS (containing video sequences), and SCRIPTS (with two scripts that automate the downloading, unzipping, and processing of the dataset). The main directory also includes three descriptive plain-text files and an image:

• "readme.txt": a brief guide to the dataset, describing its basic characteristics, the testbed or experimental framework used to generate it, and the organization of the data files.
• "user_characteristics.txt": contains, for each participant, a line of six comma-separated numerical values describing their personal characteristics in the following order: 1) an abstract user identifier (a number from 01 to 25), 2) a binary value indicating whether the participant is left-handed (0) or right-handed (1), 3) a numerical value indicating gender: male (0), female (1), undefined or undisclosed (2), 4) the weight in kg, 5) the height in cm, and 6) the age in years.
• "activity_description.txt": for each activity, this file contains a line with the activity identifier (numbered from 01 to 29) and an alphanumeric string briefly describing the performed action.
• "sensor_orientation.jpg": a JPEG image illustrating the way the sensor is worn and the orientation of the measurement axes.

The TRACES subfolder with the data is, in turn, organized into 25 secondary subfolders, one per participant, named with the word "output" followed by an underscore (_) and the corresponding participant identifier (a number from 1 to 25). Each subdirectory contains one CSV (Comma-Separated Values) file for each trial (each repetition of any activity) performed by the corresponding volunteer. The filenames of the monitored data follow the format "user_XX_activity_YY_trial_ZZ.csv", where XX, YY, and ZZ represent the identifiers of the participant, the activity, and the repetition number, respectively.

The files do not include any header; each line corresponds to one sample taken by the sensing node, i.e. the set of simultaneous measurements captured by the sensors of the Shimmer mote at a certain instant. The values in each line are arranged as follows:

Timestamp, Ax, Ay, Az, Gx, Gy, Gz, Mx, My, Mz, P

where:
- Timestamp is the time at which the following measurements were taken, in milliseconds elapsed since the start of the recording. The first sample, in the first line of the file, therefore has a zero value, while the remaining timestamps are relative to this first sample.
- Ax, Ay, Az are the measurements of the three axes of the triaxial accelerometer (in g units).
- Gx, Gy, Gz are the components of the angular velocity measured by the triaxial gyroscope (in degrees per second, dps).
- Mx, My, Mz are the 3-axis magnetometer data in microteslas (µT).
- P is the pressure measurement in millibars.

Besides, the VIDEOS directory includes 29 anonymized video clips that illustrate the 29 manual activities carried out by the participants. The video files are encoded in MPEG4 format and named according to the format "Example_Activity_XX.mp4", where XX is the identifier of the movement (as described in the activity_description.txt file).

Finally, the SCRIPTS subfolder comprises two scripts, written in Python and MATLAB. These two programs (both named Load_traces) perform the same function and are designed to automate the downloading and processing of the data. Specifically, they perform the following tasks:
1. Download the database from the public repository as a single compressed zip file.
2. Unzip this file and create the subfolder structure of the dataset in a directory named UMAHand_Dataset. As noted above, the TRACES subfolder contains one CSV trace file per experiment (i.e. per movement, user, and trial).
3. Read all the CSV files and store their information in a list of dictionaries (Python) or an array of structures (MATLAB) named datasetTraces. Each element in that list/array has two fields: the filename (which identifies the user, the type of performed activity, and the trial number) and a numerical array of 11 columns containing the timestamps and the sensor measurements for that experiment (arranged as described above).

All experiments and data acquisition were conducted in private home environments. Participants were asked to perform activities involving sustained or continuous hand movements (e.g. clapping) for at least 10 seconds. For brief, punctual movements that may require less than 10 seconds (e.g. picking up an object from the floor), volunteers were simply asked to execute the action to its conclusion. In total, 752 samples were collected, with durations ranging from 1.98 to 119.98 seconds.
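A minimal parsing sketch for one trace file with pandas, using the column order documented above (the participant/activity/trial identifiers in the path are illustrative):

```python
import pandas as pd

# Column order as documented above; trace files have no header row.
COLUMNS = ["timestamp_ms",
           "ax_g", "ay_g", "az_g",          # accelerometer, g
           "gx_dps", "gy_dps", "gz_dps",    # gyroscope, degrees/s
           "mx_uT", "my_uT", "mz_uT",       # magnetometer, microtesla
           "p_mbar"]                        # barometer, millibar

# Participant/activity/trial identifiers here are illustrative.
trace = pd.read_csv("UMAHand_Dataset/TRACES/output_1/"
                    "user_01_activity_03_trial_02.csv",
                    header=None, names=COLUMNS)
print(f"trial length: {trace['timestamp_ms'].iloc[-1] / 1000.0:.2f} s")
```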
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The GAIT2CARE database contains socio-demographic information, functional assessments, and gait data of older adults, collected before and after an 8-week multicomponent physical exercise intervention, with the goal of evaluating health status and temporal evolution.
The dataset includes information from 127 participants, consisting of 85 women (67%) and 42 men (33%), aged between 70 and 93 years (82.36 ± 5.34 years).
Participants were divided into two groups according to the type of exercise program followed:
Group A (on-site): group-based exercise guided by a specialist at the hospital’s setting (n=63).
Group B (app-guided): home-based multicomponent exercise program implemented with remote supervision via the VIVIFIL App providing video instructions, monitored adherence, and allowing chat communication with the healthcare supervisor (n=64).
The study followed a pre-post design, with functional and gait assessments performed at two visits: week 0 (before the intervention) and week 8 (after the intervention). The level of compliance/adherence with the exercise program at week 8 is also included: null (<20%), poor (20-50%), medium (50-70%), and remarkable (>70%).
Functional assessments comprised the 4-meter walking test, the Timed Up and Go (TUG) test, the Short Physical Performance Battery (SPPB), the Fried Frailty Criteria, and the Falls Efficacy Scale – International (FES-I).
Gait data were captured by inertial measurement units (IMUs) placed on the feet during walks lasting 13.07 ± 5.15 minutes, which were analyzed to estimate characteristic gait parameters.
Inertial data from the foot-mounted IMUs (acceleration (m/s2), angular velocity (rad/s), and timestamps (s)) are included in the database as .csv files for each participant, trial (week 0 and week 8), and foot (right foot (RF) and left foot (LF)), allowing researchers to pursue other approaches to gait analysis.
The complete gait analysis is also included for each participant and trial in .csv files, with the gait parameters estimated for every individual step. The gait parameters included are: cycle duration (CD) (s), cadence (steps/min), stride length (SL) (m), 3D path length (%SL), 2D path length (%SL), stride velocity (m/s), swing percentage (%CD), stance percentage (%CD), percentage of stance subphases (loading, foot-flat, and pushing) (%stance), heel strike pitch (degrees), toe-off pitch (degrees), peak angular velocity (degrees/s), turning angle (degrees), heel range of motion (RoM) (degrees), double support (%CD), and normalized stride length (SL/height). The gait analysis was conducted following the methodology described in [1].
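A minimal loading sketch for one foot-mounted IMU file; the path and column names below are hypothetical, so check the headers of the released .csv files:

```python
import numpy as np
import pandas as pd

# Hypothetical path and column names for one participant/visit/foot.
imu = pd.read_csv("P001_week0_RF.csv")

t = imu["timestamp"].to_numpy()                     # s
acc = imu[["acc_x", "acc_y", "acc_z"]].to_numpy()   # m/s^2
gyr = imu[["gyr_x", "gyr_y", "gyr_z"]].to_numpy()   # rad/s

print(f"{t[-1] - t[0]:.1f} s recorded, "
      f"mean |a| = {np.linalg.norm(acc, axis=1).mean():.2f} m/s^2, "
      f"peak |w| = {np.linalg.norm(gyr, axis=1).max():.2f} rad/s")
```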
Of the 127 participants initially registered for the GAIT2CARE project, five abandoned the exercise program before completing it, and inertial data from several others were lost due to IMU recording failures or technical/human problems. Consequently, the inertial and gait analysis files cover the 93 participants for whom complete and valid inertial data were available: 44 in group A (on-site) and 49 in group B (app-guided).
GAIT2CARE is designed to support research on the effectiveness of exercise interventions in older adults, particularly in relation to gait (inertial analysis) and functional status. However, this dataset is also appropriate for extended research on mobility, frailty, fall risk, aging and gait analysis based on foot-mounted inertial sensors.
The study was approved by the Research Ethics Committee on Medicinal Products (CEIm) of the Hospital Universitario de Albacete on June 27, 2023 (Reference code No. 2023-071) and has been prospectively registered on ClinicalTrials.gov with identifier NCT06936865 (https://clinicaltrials.gov/study/NCT06936865).
[1] L. Ruiz-Ruiz, J. J. García-Domínguez and A. R. Jiménez, "A Novel Foot-Forward Segmentation Algorithm for Improving IMU-Based Gait Analysis," in IEEE Transactions on Instrumentation and Measurement, vol. 73, pp. 1-13, 2024, Art no. 4010513, doi: 10.1109/TIM.2024.3449951.
Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0)https://creativecommons.org/licenses/by-nc-sa/4.0/
License information was derived automatically
WIFI RSSI Indoor Positioning Dataset
A reliable and comprehensive public WiFi fingerprinting database for researchers to implement and compare indoor localization methods. The database contains RSSI information from 6 APs collected on different days with the support of an autonomous robot, which we use to gather the WiFi fingerprint data. Our 3-wheel robot has multiple sensors, including a wheel odometer, an inertial measurement unit (IMU), a LIDAR, sonar sensors… See the full description on the dataset page: https://huggingface.co/datasets/Brosnan/WIFI_RSSI_Indoor_Positioning_Dataset.
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
NEWBEE: A Multi-Modal Gait Database of Natural Everyday-Walk in an Urban Environment -- Addendum

Authors: M. Hasenjaeger, V. Losing
Affiliation: Honda Research Institute Europe GmbH
Address: Carl-Legien-Str. 30, 63065 Offenbach am Main, Germany
Corresponding author: M. Hasenjaeger
Contact information: martina.hasenjaeger@honda-ri.de

GENERAL INTRODUCTION

This dataset is an addendum to the data published in [1]. It contains the gaze positions and eyetracker IMU data (1) in raw format as exported from the eyetracker and (2) synchronized with the remaining data in [1]. The data is being made public as supplementary data for publications and so that other researchers can use it in their own work.

METHODS

The data were recorded from 20 healthy participants (five females, fifteen males, between 18 and 69 years old, 178.5 ± 7.64 cm, 72.9 ± 8.7 kg). Data recordings took place around a suburban train station close to Frankfurt/Main in Germany. All participants provided written informed consent, including written permission to publish the data of the study. The study was approved by the Bioethics Committee in Honda's R&D (97HM-036H, Dec. 14, 2020). The data recording procedure is described in detail in [2].

DESCRIPTION OF THE DATA IN THIS DATASET

DIRECTORY STRUCTURE

The data in this dataset is organized as follows:

eyetracker / raw_data / gaze_positions / data-files
eyetracker / raw_data / imu_data / data-files
eyetracker / sync_data / data-files

The subdirectory 'raw_data' contains the raw data from the Pupil Invisible eyetracker as exported by Pupil Player, with the gaze positions in 'raw_data/gaze_positions' and the IMU data in 'raw_data/imu_data'. The subdirectory 'sync_data' contains gaze positions and IMU data that are synchronized with the dataset [1], i.e. the time stamps are identical.

FILES

eyetracker / raw_data / gaze_positions

For each experiment course and participant there is a file <course>_<participant>-gaze_positions.csv in CSV format with the following columns:

gaze_timestamp : timestamp of the source image frame
world_index : index of the closest world video frame (world videos are not included in this dataset)
confidence : computed confidence between 0 (not confident) and 1 (confident)
norm_pos_x : x position in the world image frame in normalized coordinates
norm_pos_y : y position in the world image frame in normalized coordinates

The remaining data columns were not available for Pupil Invisible at the time of export and hence are empty:

base_data : timestamp-id of the pupil data from which this gaze position is computed
gaze_point_3d_x : x position of the 3D gaze point (the point the subject looks at) in the world camera coordinate system
gaze_point_3d_y : y position of the 3D gaze point
gaze_point_3d_z : z position of the 3D gaze point
eye_center0_3d_x : x center of eye-ball 0 in the world camera coordinate system (of camera 0 for binocular systems, or any eye camera for monocular systems)
eye_center0_3d_y : y center of eye-ball 0
eye_center0_3d_z : z center of eye-ball 0
gaze_normal0_x : x normal of the visual axis for eye 0 in the world camera coordinate system (of eye 0 for binocular systems, or any eye for monocular systems); the visual axis goes through the eye-ball center and the object that is looked at
gaze_normal0_y : y normal of the visual axis for eye 0
gaze_normal0_z : z normal of the visual axis for eye 0
eye_center1_3d_x : x center of eye-ball 1 in the world camera coordinate system (not available for monocular setups)
eye_center1_3d_y : y center of eye-ball 1
eye_center1_3d_z : z center of eye-ball 1
gaze_normal1_x : x normal of the visual axis for eye 1 in the world camera coordinate system (not available for monocular setups); the visual axis goes through the eye-ball center and the object that is looked at
gaze_normal1_y : y normal of the visual axis for eye 1
gaze_normal1_z : z normal of the visual axis for eye 1

eyetracker / raw_data / imu_data

For each experiment course and participant there is a file <course>_<participant>-imu_data.csv in CSV format with the following columns:

imu_timestamp : timestamp of the sample
world_index : index of the closest world video frame (world videos are not included in this dataset)
gyro_x : rotation speed around x in degrees/s
gyro_y : rotation speed around y in degrees/s
gyro_z : rotation speed around z in degrees/s
accel_x : translational acceleration along x in G (1 G = 9.80665 m/s^2)
accel_y : translational acceleration along y in G
accel_z : translational acceleration along z in G
pitch : drift-free estimate of the pitch (head tilt from front to back) in degrees; the output range is -180 to +180 degrees
roll : drift-free estimate of the roll (head tilt from side to side) in degrees; the output range is -180 to +180 degrees

eyetracker / sync_data

For each experiment course and participant there is a file eyetracker_<course>_<participant>.csv in CSV format with the following columns:

time : experiment time stamp synchronized with the data in [1]
et_timestamp : IMU timestamp of the sample
et_gyro_x : rotation speed around x in degrees/s
et_gyro_y : rotation speed around y in degrees/s
et_gyro_z : rotation speed around z in degrees/s
et_acceleration_x : translational acceleration along x in G (1 G = 9.80665 m/s^2)
et_acceleration_y : translational acceleration along y in G
et_acceleration_z : translational acceleration along z in G
et_pitch : drift-free estimate of the pitch (head tilt from front to back) in degrees; the output range is -180 to +180 degrees
et_roll : drift-free estimate of the roll (head tilt from side to side) in degrees; the output range is -180 to +180 degrees
gaze_confidence : computed gaze position confidence between 0 (not confident) and 1 (confident)
gaze_norm_pos_x : x position in the world image frame in normalized coordinates
gaze_norm_pos_y : y position in the world image frame in normalized coordinates

NOTE

The inertial measurement unit (IMU) is integrated into the right temple of the Pupil Invisible glasses. It is oriented in the frame such that the x-axis points to the right, the y-axis points downwards, and the z-axis points to the front. Gaze data is output in the pixel space of the scene camera image, which has a resolution of 1088x1080 px. The origin is in the top-left corner of the image. Cf. the Pupil Labs Invisible documentation.

REFERENCES

[1] Losing, V., Hasenjäger, M. (2022). NEWBEE: A Multi-Modal Gait Database of Natural Everyday-Walk in an Urban Environment. figshare. Collection. doi: 10.6084/m9.figshare.c.5758997.v1
[2] Losing, V., & Hasenjäger, M. (2022). A multi-modal gait database of natural everyday-walk in an urban environment. Scientific Data, 9(1), 473. doi: 10.1038/s41597-022-01580-3
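A small sketch of reading one synchronized eyetracker file with pandas, using the documented columns (the course/participant identifiers in the file name are illustrative):

```python
import pandas as pd

G = 9.80665  # m/s^2 per G, as stated in the column description

# Course/participant identifiers in the name are illustrative.
et = pd.read_csv("eyetracker/sync_data/eyetracker_A_01.csv")

# Convert eyetracker accelerations from G to m/s^2.
for axis in ("x", "y", "z"):
    et[f"et_acceleration_{axis}"] *= G

# Keep only reasonably confident gaze samples (threshold is arbitrary).
confident = et[et["gaze_confidence"] > 0.8]
print(f"{len(confident)}/{len(et)} samples with gaze confidence > 0.8")
```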
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Author: M. Hasenjaeger
Affiliation: Honda Research Institute Europe GmbH
Address: Carl-Legien-Str. 30, 63065 Offenbach am Main, Germany
Corresponding author: M. Hasenjaeger
Contact information: martina.hasenjaeger@honda-ri.de

General introduction

This dataset contains data collected in everyday walk scenarios in a natural urban environment. It is a subset of the data published in [1], supplemented by IMU data recorded with the mobile eye tracker. The data is being made public as supplementary data for publications and so that other researchers can use it in their own work.

Methods

The data were recorded from 20 healthy participants (five females, fifteen males, between 18 and 69 years old, 178.5 ± 7.64 cm, 72.9 ± 8.7 kg). Data recordings took place around a suburban train station close to Frankfurt/Main in Germany. All participants provided written informed consent, including written permission to publish the data of the study. The study was approved by the Bioethics Committee in Honda's R&D (97HM-036H, Dec. 14, 2020).

The data recording procedure is described in detail in [2]. Of the three walk courses described in [2], we include only data from course A for each of the 20 participants. The data is augmented by the head pitch angle available from the IMU sensor of the mobile eyetracker; these data are not included in [1]. From the motion data in [1], we include only the lower-body joint angles.

Baseline pitch angles of zero degrees for head and eyes, respectively, were established empirically for each individual as the average pitch angles measured during straight level walking in segment 3 of course A (cf. Fig. 8 in [2]). The data from the first and last six steps in this segment were considered part of walk-mode transitions and hence were not used to compute the baseline pitch angles. Deviations of the head and eye pitch angles from these empirical baselines are included in this dataset. Transitions between walk modes are labeled more precisely than in [1].

Description of the data in this dataset

The data is organized by participant. For each participant, there is a file "gaze_motion_courseA_<participant>" in CSV format containing, among others, the following information:

- time stamp of measurement
- right and left foot x-, y-, z-position coordinates
- right and left hip, knee, and ankle angles
- head pitch angle
- eye pitch angle
- gaze pitch angle
- heel strike indicators
- step indicators
- walk mode, i.e. walk, stairs up, stairs down, ramp up, ramp down
- indicator of change in walk mode

UPDATE (version 1)

In the first published version, there was an error in the calculation of the baseline values that resulted in the inclusion of steps belonging to the transition phase. This error is corrected in this version (version 1). It affects only the columns containing deviations from baseline values; these deviations have been corrected. Deviations from straight level walking for the lower-body joint angles are included in this dataset.

References

[1] Losing, V., Hasenjäger, M. (2022). NEWBEE: A Multi-Modal Gait Database of Natural Everyday-Walk in an Urban Environment. figshare. Collection. doi: 10.6084/m9.figshare.c.5758997.v1
[2] Losing, V., & Hasenjäger, M. (2022). A multi-modal gait database of natural everyday-walk in an urban environment. Scientific Data, 9(1), 473. doi: 10.1038/s41597-022-01580-3
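A sketch of one way to use these files, grouping the pitch columns by walk mode with pandas (the column names are illustrative stand-ins for the actual CSV header):

```python
import pandas as pd

# Column names are illustrative stand-ins for the actual CSV header.
df = pd.read_csv("gaze_motion_courseA_01.csv")

# Average head/eye/gaze pitch per walk mode, excluding samples flagged
# as walk-mode transitions.
steady = df[df["walk_mode_change"] == 0]
print(steady.groupby("walk_mode")[["head_pitch", "eye_pitch", "gaze_pitch"]]
      .mean())
```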