This resource contains the experimental data that was previously included in the Tecplot input files, here provided as MATLAB files. dba1_cp holds all the results and is dimensioned (7,2): the first dimension is the span station (1-7) and the second dimension is the surface (1 = upper, 2 = lower). dba1_cp(ispan,isurf).x are the x/c locations at span station ispan on the upper (isurf=1) or lower (isurf=2) surface; dba1_cp(ispan,isurf).y are the eta locations; dba1_cp(ispan,isurf).cp are the pressures. Unsteady CP is dimensioned with 4 columns: the 1st column is the real part, the 2nd the imaginary part, the 3rd the magnitude, and the 4th the phase in degrees. M, Re, and other pertinent variables are included as variables and are also collected in casedata (casedata.M, etc.).
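For orientation, here is a minimal MATLAB sketch of accessing and plotting one of these distributions; the .mat file name is an assumption, since only the variable layout is documented above.

```matlab
% Minimal sketch, assuming dba1_cp has been loaded from one of the
% provided .mat files; the file name below is hypothetical.
load('dba1_cp.mat');            % hypothetical file name

ispan = 3;                      % span station (1-7)
isurf = 1;                      % 1 = upper surface, 2 = lower surface

x  = dba1_cp(ispan, isurf).x;   % x/c locations at this station/surface
cp = dba1_cp(ispan, isurf).cp;  % steady pressure coefficients

plot(x, cp, 'o-');
set(gca, 'YDir', 'reverse');    % Cp axes are conventionally inverted
xlabel('x/c'); ylabel('C_p');
title(sprintf('Span station %d, upper surface', ispan));
```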
License: http://dcat-ap.ch/vocabulary/licenses/terms_by
Overview
The SUNWELL Modelling Environment is a combination of data and code that models electricity production from satellite-derived irradiance data and other spatial data sets for all of Switzerland. This ensemble accompanies the publication "The bright side of PV production in snow-covered mountains", published in the Proceedings of the National Academy of Sciences, and reproduces all of that paper's results and figures. Code and resources are in their original form (with documentation). A new version with a more generalized application to PV modelling and with more flexibility in terms of input and output formats will be released in the coming months.
Format
All code is written in Matlab and must be executed in Matlab. The input and output data sets are also in the Matlab-specific .mat format. Whenever publicly available, the original data is provided as GeoTIFF, .xlsx, or another common format. This is the case for:
- Digital Elevation Model (InputsFromMatlab/MSG/OriginalData/ASTERDEM)
- Land surface cover type (InputsFromMatlab/MSG/OriginalData/CORINE)
- Population Density (InputsFromMatlab/MSG/OriginalData/popdensRaster)
- Electricity production from three of our validation sites (/Validation/WSL)
- Measured irradiance for two validation sites (/Validation/ASRB)
The ‘Metadata’ documents in the respective folders provide further information about the data sources and processing. Figures are produced in either .pdf or .png format.
Structure
The central level of the SUNWELL environment holds the 5 Mains, which run the different modelling aspects of the paper; each is documented separately. Additional code is located in the ‘DataProcessing’ and ‘functions’ folders. The functions are called in the different Mains.
‘InputsFromMatlab’ contains the radiation and albedo input data sets in separate subfolders (SIS/SISDIR/ALB). The original data is not publicly available, but can be requested for research purposes free of charge. We provide a processed subset of the data set that was used to run the SUNWELL simulations. The MSG subfolder contains additional spatial input data sets.
‘Outputs’ contains the output files from the different Mains (matching names, e.g. Main_CHallpixels.m generates Prod_CHallpixels).
‘Publication_figures’ contains all individual figures from the PNAS publication, as well as the generating code (/code_plot) and the PowerPoint figures (/ppts) that provide the combined final figures.
‘Validation’ contains the data sets used in the model validation:
- Electricity production from three of our validation sites (/WSL)
- Measured irradiance for two validation sites (/ASRB)
- Electricity production from a validation site at Lac des Toules in Wallis (/LDT); this data set was provided under an NDA and cannot be made publicly available.
Paper Citation:
Annelen Kahl; Jérôme Dujardin; Michael Lehning (2018). Dataset on PV Production in Snow Covered Mountains. PNAS - Proceedings of the National Academy of Sciences. (in press)
License: Attribution-ShareAlike 4.0 (CC BY-SA 4.0), https://creativecommons.org/licenses/by-sa/4.0/
Welcome to the Case Western Reserve University Bearing Data Center Website
This website provides access to ball bearing test data for normal and faulty bearings. Experiments were conducted using a 2 hp Reliance Electric motor, and acceleration data was measured at locations near to and remote from the motor bearings. These web pages are unique in that the actual test conditions of the motor as well as the bearing fault status have been carefully documented for each experiment.
Motor bearings were seeded with faults using electro-discharge machining (EDM). Faults ranging from 0.007 inches in diameter to 0.040 inches in diameter were introduced separately at the inner raceway, rolling element (i.e. ball) and outer raceway. Faulted bearings were reinstalled into the test motor and vibration data was recorded for motor loads of 0 to 3 horsepower (motor speeds of 1797 to 1720 RPM).
Project History
Experiments are often required in order to validate new technologies, theories, and techniques. These motor bearing experiments were initiated in order to characterize the performance of IQ PreAlert, a motor bearing condition assessment system developed at Rockwell. From this original impetus, the experimental program has expanded to provide a motor performance database that can be used to validate and/or improve a host of motor condition assessment techniques. Some projects that have recently made or are currently making use of this database include: Winsnode condition assessment technology, model-based diagnostic techniques, and motor speed determination algorithms.
Apparatus & Procedures
As shown in Figure 1 above, the test stand consists of a 2 hp motor (left), a torque transducer/encoder (center), a dynamometer (right), and control electronics (not shown). The test bearings support the motor shaft. Single point faults were introduced to the test bearings using electro-discharge machining with fault diameters of 7 mils, 14 mils, 21 mils, 28 mils, and 40 mils (1 mil = 0.001 inches). See FAULT SPECIFICATIONS for fault depths. SKF bearings were used for the 7, 14, and 21 mil diameter faults, and NTN equivalent bearings were used for the 28 mil and 40 mil faults. Drive end and fan end bearing specifications, including bearing geometry and defect frequencies, are listed in the BEARING SPECIFICATIONS.
Vibration data was collected using accelerometers, which were attached to the housing with magnetic bases. Accelerometers were placed at the 12 o’clock position at both the drive end and fan end of the motor housing. During some experiments, an accelerometer was attached to the motor supporting base plate as well. Vibration signals were collected using a 16 channel DAT recorder, and were post processed in a Matlab environment. All data files are in Matlab (*.mat) format. Digital data was collected at 12,000 samples per second, and data was also collected at 48,000 samples per second for drive end bearing faults. Speed and horsepower data were collected using the torque transducer/encoder and were recorded by hand.
Outer raceway faults are stationary faults; therefore, placement of the fault relative to the load zone of the bearing has a direct impact on the vibration response of the motor/bearing system. In order to quantify this effect, experiments were conducted for both fan and drive end bearings with outer raceway faults located at 3 o’clock (directly in the load zone), at 6 o’clock (orthogonal to the load zone), and at 12 o’clock (opposite the load zone).
Download a Data File
Data was collected for normal bearings, and for single-point drive end and fan end defects. Data was collected at 12,000 samples/second, and at 48,000 samples/second for drive end bearing experiments. All fan end bearing data was collected at 12,000 samples/second.
Data files are in Matlab format. Each file contains fan and drive end vibration data as well as the motor rotational speed. In all files, the following items in the variable names indicate:
DE - drive end accelerometer data
FE - fan end accelerometer data
BA - base accelerometer data
time - time series data
RPM - rpm during testing
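As a quick-start sketch (not part of the official documentation), the following MATLAB snippet loads one downloaded file and plots the drive end channel; the file name is hypothetical, and the exact variable names vary by file, so inspect them first with whos.

```matlab
% Minimal sketch, assuming a downloaded data file; the file name is
% hypothetical, and variable names differ between files, so list them
% first with whos('-file', ...).
matfile = '97.mat';                  % hypothetical normal-baseline file
whos('-file', matfile)               % list the variables in the file

S  = load(matfile);                  % load everything into a struct
fn = fieldnames(S);
de = S.(fn{find(contains(fn, 'DE'), 1)});   % drive end accelerometer data

fs = 12000;                          % 12,000 samples/second for this file
t  = (0:numel(de)-1) / fs;
plot(t, de); xlabel('Time (s)'); ylabel('Acceleration');
```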
We provide MATLAB binary files (.mat) and comma-separated values (CSV) files of data collected from a pilot study of a plug load management system that allows for the metering and control of individual electrical plug loads. The study included 15 power strips, each containing 4 channels (receptacles), which wirelessly transmitted power consumption data approximately once per second to 3 bridges. The bridges were connected to a building local area network which relayed data to a cloud-based service. Data were archived once per minute, with the minimum, mean, and maximum power draw over each one-minute interval recorded. The uncontrolled portion of the testing spanned approximately five weeks and established a baseline energy consumption. The controlled portion of the testing employed schedule-based rules for turning off selected loads during non-business hours; it also modified the energy saver policies for certain devices. Three folders are provided: “matFilesAllChOneDate” provides a MAT-file for each date, each file containing all channels; “matFilesOneChAllDates” provides a MAT-file for each channel, each file containing all dates; “csvFiles” provides comma-separated values files for each date (note that because of data export size limitations, there are 10 CSV files for each date). Each folder has the same data; there is no difference in content, only in the way it is organized.
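A minimal sketch of reading one of the per-date MAT-files, assuming nothing about the stored variable names (the file name and the commented variable are hypothetical):

```matlab
% Minimal sketch, assuming the per-date MAT-files in matFilesAllChOneDate;
% the file name and any variable names are hypothetical -- inspect the
% actual contents with whos('-file', ...) after downloading.
matfile = 'plugload_oneDate.mat';    % hypothetical per-date file
whos('-file', matfile)               % list stored variables

S = load(matfile);                   % minute-level min/mean/max power data
% e.g., if a mean-power matrix exists (minutes x channels), plot channel 7:
% plot(S.meanPower(:, 7)); xlabel('Minute of day'); ylabel('Power (W)');
```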
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
Analyzed session data structures for all data collected. The data structures include multidimensional behavioral data extracted from video and external sensors, as well as simultaneous photometry recordings from multiple locations in the mouse brain. All datasets are aligned to include the first ~1000 trials of learning for >20 animals. A subset of animals received optogenetic perturbations during learning, as described in the paper and methods.
MATLAB scripts, functions, and data used to build Poly3D models and create permeability potential GIS layers for 1) the Mount St. Helens seismic zone, 2) the Wind River Valley, and 3) the Mount Baker geothermal prospect areas located in Washington State.
Overview
The SUMR-D CART2 turbine data are recorded by the CART2 wind turbine's supervisory control and data acquisition (SCADA) system for the Advanced Research Projects Agency–Energy (ARPA-E) SUMR-D project located at the National Renewable Energy Laboratory (NREL) Flatirons Campus. For the project, the CART2 wind turbine was outfitted with a highly flexible rotor specifically designed and constructed for the project. More details about the project can be found here: https://sumrwind.com/. The data include power, loads, and meteorological information from the turbine during startup, operation, and shutdown, and when it was parked and idle.
Data Details
Additional files are attached:
- sumr_d_5-Min_Database.mat - a database file in MATLAB format for this dataset, which can be used to search for desired data files;
- sumr_d_5-Min_Database.xlsx - a database file in Microsoft Excel format for this dataset, which can be used to search for desired data files;
- loadcartU.m - this script loads a CART data file and puts it in your workspace as a MATLAB matrix (you can call this script from your own MATLAB scripts to do your own analysis);
- charts.mat - a dependency file needed by the other scripts (it allows you to make custom preselections for cartPlotU.m);
- cartLoadHdrU.m - this script loads the header information for a data file (the header is embedded at the beginning of each data file);
- cartPlotU.m - a graphical user interface (GUI) that allows you to interactively look at different channels (to use it, run the script in MATLAB and load in the data file(s) of interest; from there, you can select different channels and plot them against each other; note that this script has issues with later versions of MATLAB; the preferred version is R2011b).
Data Quality
Wind turbine blade loading data were calibrated using blade gravity calibrations prior to data collection and throughout the data collection period. Blade loading was also checked for data quality following data collection, as the strain gauge measurements drifted throughout the data collection. These drifts in the strain gauge measurements were removed in post-processing.
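Based on the descriptions above, a typical session might look like the following sketch; the folder path is a placeholder, and the scripts' exact prompts and workspace variables are as described above but not verified here.

```matlab
% Minimal sketch of the described workflow, assuming the provided scripts
% and charts.mat are on the MATLAB path; folder name is hypothetical.
addpath('sumr_d_scripts');   % hypothetical folder holding the .m files

loadcartU;                   % loads a CART data file into the workspace
                             % as a matrix (per the description above)
cartLoadHdrU;                % loads the embedded header information
cartPlotU;                   % GUI channel browser (best run under R2011b)
```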
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
This dataset contains a compressed folder with the data and MATLAB scripts used to produce the relevant figures and candidates for "GRITCLEAN: A glitch veto scheme for Gravitational wave data", as presented in https://arxiv.org/abs/2401.15237.
The codes in this dataset include:
A PSO-based matched filtering search pipeline which can be run on either the positive or the negative chirp time space.
A standalone MATLAB script called GRITCLEAN.m which can run the GRITCLEAN hierarchical vetoes on a set of positive and negative chirp time space estimated parameters.
A plotting script to generate relevant figures.
The files in this dataset include:
GVSsegPSDtrainidxs.mat, a binary MATLAB file containing training indices for all segments from which the Power Spectral Densities (PSDs) are estimated; this is done via the provided scripts getsegPSD.m and createPSD.m.
A sample HDF5 file used (H-H1_GWOSC_O3a_4KHZ_R1-1243394048-4096.hdf5); a minimal reading sketch follows this list.
JSON files containing information about the data segments and the strain data files from which they originate.
Text files containing the parameters estimated by the PSO-based pipeline across the positive and negative chirp time space runs.
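For orientation only, here is a minimal MATLAB sketch of reading the strain series from the sample GWOSC file listed above; it assumes the standard GWOSC 4 kHz HDF5 layout (a '/strain/Strain' dataset with an 'Xspacing' attribute) and is not part of the provided pipeline.

```matlab
% Minimal sketch, assuming the standard GWOSC 4 kHz HDF5 layout; verify
% the dataset paths with h5disp(fname) first. Not part of the pipeline.
fname = 'H-H1_GWOSC_O3a_4KHZ_R1-1243394048-4096.hdf5';
h5disp(fname, '/strain');                    % inspect the strain group

strain = h5read(fname, '/strain/Strain');    % strain time series
dt = h5readatt(fname, '/strain/Strain', 'Xspacing');   % sample spacing (s)

t = (0:numel(strain)-1) * dt;
plot(t, strain); xlabel('Time since file start (s)'); ylabel('Strain');
```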
Detailed instructions on dependencies, downloading the dataset, and running the codes are given in the README.txt file included with this dataset. Users are advised to go through this file first.
The enclosed scripts have dependencies on JSONLAB, the Parallel Computing Toolbox, and the Signal Processing Toolbox for MATLAB, along with additional scripts provided in the GitHub repositories Accelerated-Network-Analysis and SDMBIGDAT19. Instructions on installing these dependencies are provided in README.txt.
All codes have been developed and tested on MATLAB R2022 and R2023.
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
We are pleased to share the dataset SEED-PQD-v1 (SEED Power Quality Disturbance Dataset v1) used in our study "XPQRS: Expert power quality recognition system for sensitive load applications," published in the Elsevier journal Measurement. This dataset is invaluable for researchers and practitioners in the field of power quality analysis, especially those focusing on sensitive load applications. The dataset can be used in Python as well as in MATLAB.
Access the published paper:
https://www.sciencedirect.com/science/article/abs/pii/S0263224123004530
Dataset Details:
Fundamental Frequency: 50 Hz
Sampling Rate: 5 kHz
Number of Classes: 17
Signals per Class: 1000
Length of Each Signal (samples): 100
Length of Each Signal (time): 20 ms
Amplitude of Each Signal: Scaled between -1 and 1
Data Format:
The dataset is available in two formats: MATLAB and CSV.
MATLAB File:
Filename: 5Kfs_1Cycle_50f_1000Sam_1A.mat
Structure: A matrix of dimensions (1000 x 100 x 17), indexed as shown in the sketch after the class list below, where:
1000 = Signals per class
100 = Samples per signal
17 = Number of classes
Class Order:
Pure_Sinusoidal
Sag
Swell
Interruption
Transient
Oscillatory_Transient
Harmonics
Harmonics_with_Sag
Harmonics_with_Swell
Flicker
Flicker_with_Sag
Flicker_with_Swell
Sag_with_Oscillatory_Transient
Swell_with_Oscillatory_Transient
Sag_with_Harmonics
Swell_with_Harmonics
Notch
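Given the dimensions and class order above, here is a minimal MATLAB sketch for pulling out a single example signal; the stored variable name is not documented here, so it is read generically.

```matlab
% Minimal sketch, assuming the MAT-file holds a single 1000x100x17 array;
% the variable name is undocumented, so it is read generically.
matfile = '5Kfs_1Cycle_50f_1000Sam_1A.mat';
S  = load(matfile);
fn = fieldnames(S);
X  = S.(fn{1});                      % signals x samples x classes

sagClass = 2;                        % 'Sag' is 2nd in the class order above
sig = squeeze(X(1, :, sagClass));    % first Sag signal (100 samples)

fs = 5000;                           % 5 kHz sampling, one 50 Hz cycle
t  = (0:numel(sig)-1) / fs * 1000;   % time in ms (0 to ~20 ms)
plot(t, sig); xlabel('Time (ms)'); ylabel('Amplitude (scaled)');
```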
CSV Files:
Files: 17 CSV files, one for each class.
Structure: Each CSV file has dimensions (1000 x 100), where:
1000 = Signals per class
100 = Samples per signal
Usage:
This dataset is designed to support the development and testing of power quality recognition systems. The 17 classes cover a broad range of power quality disturbances, providing a comprehensive resource for training machine learning models and validating their performance in recognizing various types of power quality issues.
Acknowledgements:
All users of the dataset are advised to cite the following article:
Citation: Muhammad Umar Khan, Sumair Aziz, Adil Usman, XPQRS: Expert power quality recognition system for sensitive load applications, Measurement, Volume 216, 2023, 112889, ISSN 0263-2241, https://doi.org/10.1016/j.measurement.2023.112889.
Thank you for your interest in our work. We hope this dataset facilitates further advancements in power quality analysis and related fields.
Keywords: Power Quality Recognition, Power Quality Classification, Electrical Signal Analysis, Power System Disturbances, Signal Processing, Power Quality Monitoring
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
Wiens, S., van Berlekom, E., Szychowska, M., & Eklund, R. (2019). Visual Perceptual Load Does Not Affect the Frequency Mismatch Negativity. Frontiers in Psychology, 10(1970). doi:10.3389/fpsyg.2019.01970
We manipulated visual perceptual load (high and low load) while we recorded electroencephalography. Event-related potentials (ERPs) were computed from these data.
OSF_*.pdf contains the preregistration at the Open Science Framework (OSF): https://doi.org/10.17605/OSF.IO/EWG9X
ERP_2019_rawdata_bdf.zip contains the raw EEG data files that were recorded with a BioSemi system (www.biosemi.com). The files can be opened in MATLAB (https://www.mathworks.com/products/matlab.html) with the FieldTrip toolbox (http://www.fieldtriptoolbox.org/).
ERP_2019_visual_load_fieldtrip_scripts.zip contains all the MATLAB scripts that were used to process the ERP data with the FieldTrip toolbox (http://www.fieldtriptoolbox.org/).
ERP_2019_fieldtrip_mat_*.zip contain the final, preprocessed individual data files. They can be opened with MATLAB.
ERP_2019_visual_load_python_scripts.zip contains the Python scripts for the main task. They need Python (https://www.python.org/) and PsychoPy (http://www.psychopy.org/).
ERP_2019_visual_load_wmc_R_scripts.zip contains the R scripts to process the working memory capacity (WMC) data (https://www.r-project.org/).
ERP_2019_visual_load_R_scripts.zip contains the R scripts to analyze the data and the output files with figures (e.g. scatterplots) (https://www.r-project.org/).
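As a minimal sketch of the FieldTrip route into the raw data (the .bdf file name is hypothetical; any file from ERP_2019_rawdata_bdf.zip will do):

```matlab
% Minimal sketch, assuming FieldTrip is installed; the .bdf file name
% below is hypothetical.
addpath('path/to/fieldtrip'); ft_defaults;   % initialize FieldTrip

cfg = [];
cfg.dataset = 'subject01.bdf';     % hypothetical raw BioSemi recording
data_raw = ft_preprocessing(cfg);  % read the continuous data

disp(data_raw.label)               % channel names
disp(data_raw.fsample)             % sampling rate (Hz)
```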
This submission contains raw load cell data and processing scripts associated with MHKDR submission 394 (UNH TDP - Concurrent Measurements of Inflow, Power Performance, and Loads for a Grid-Synchronized Vertical Axis Cross-Flow Turbine Operating in a Tidal Estuary, DOI: 10.15473/1973860) from the University of New Hampshire and Atlantic Marine Energy Center (AMEC) turbine deployment platform. The user is directed to MHKDR submission 394 for relevant context and detail on this deployment. The 394_READ_ME file here provides the description from that submission for quick reference, and the READ_ME file for this specific instrument from the 394 submission is also available here. This submission contains a zipped folder structure with the raw data in its original format and the MATLAB (2019a) processing scripts used to process and manipulate the data into its final form. The final data products are submitted in the 394 submission.
This dataset was developed as a means of identifying particular events during the SHEBA drift and assembling in one place the data necessary for driving and verifying ice-ocean models. It does not include cloud or precipitation data. It does include data of the following types: meteorological, ice, sheba_gpsdata, turb_mast, profiler, ADP, and bathymetry. Please see the Readme for more information.
This data release provides remotely sensed data, field measurements, and MATLAB code associated with an effort to produce image-derived velocity maps for a reach of the Sacramento River in California's Central Valley. Data collection occurred from September 16-19, 2024, and involved cooperators from the Intelligent Robotics Group from the National Aeronautics and Space Administration (NASA) Ames Research Center and the National Oceanographic and Atmospheric Administration (NOAA) Southwest Fisheries Science Center.
The remotely sensed data were obtained from an Uncrewed Aircraft System (UAS) and are stored in Robot Operating System (ROS) .bag files. Within these files, the various data types are organized into ROS topics including: images from a thermal camera, measurements of the distance from the UAS down to the water surface made with a laser range finder, and position and orientation data recorded by a Global Navigation Satellite System (GNSS) receiver and Inertial Measurement Unit (IMU) during the UAS flights. This instrument suite is part of an experimental payload called the River Observing System (RiOS) designed for measuring streamflow; further detail is provided in the metadata file associated with this data release. For the September 2024 test flights, the RiOS payload was deployed from a DJI Matrice M600 Pro hexacopter hovering approximately 270 m above the river. At this altitude, the thermal images have a pixel size of approximately 0.38 m but are not geo-referenced.
Two types of ROS .bag files are provided in separate zip folders. The first, Baguettes.zip, contains "baguettes" that include 15-second subsets of data with a reduced sampling rate for the GNSS and IMU. The second, FullBags.zip, contains the full set of ROS topics recorded by RiOS but has been subset to include only the time ranges during which the UAS was hovering in place over one of 11 cross sections along the reach. The start times are included in the .bag file names as portable operating system interface (POSIX) time stamps. To view the data within ROS .bag files, the Foxglove Studio program linked below is freely available and provides a convenient interface. Note that to view the thermal images, the contrast will need to be adjusted to minimum and maximum values around 12,000 to 15,000, though some further refinement of these values might be necessary to enhance the display.
To enable geo-referencing of the thermal images in a post-processing mode, another M600 hexacopter equipped with a standard visible camera was deployed along the river to acquire images from which an orthophoto was produced: 20240916_SacramentoRiver_Ortho_5cm.tif. This orthophoto has a spatial resolution of 0.05 m and is in the Universal Transverse Mercator (UTM) coordinate system, Zone 10. To assess the accuracy of the orthophoto, 21 circular aluminum ground control targets visible in both thermal and RGB (red, green, blue) images were placed in the field and their locations surveyed with a Real-Time Kinematic (RTK) GNSS receiver. The coordinates of these control points are provided in the file SacGCPs20240916.csv. Please see the metadata for additional information on the camera, the orthophoto production process, and the RTK GNSS survey.
The thermal images were used as input to Particle Image Velocimetry (PIV) algorithms to infer surface flow velocities throughout the reach.
To assess the accuracy of the resulting image-derived velocity estimates, field measurements of flow velocity were obtained using a SonTek M9 acoustic Doppler current profiler (ADCP). These data were acquired along a series of 11 cross sections oriented perpendicular to the primary downstream flow direction and spaced approximately 150 m apart. At each cross section, the boat from which the ADCP was deployed made four passes across the channel and the resulting data was then aggregated into mean cross sections using the Velocity Mapping Toolbox (VMT) referenced below (Parsons et al., 2013). The VMT output was further processed as described in the metadata and ultimately led to a single comma delimited text file, SacAdcp20240918.csv, with cross section numbers, spatial coordinates (UTM Zone 10N), cross-stream distances, velocity vector components, and water depths. To assess the sensitivity of thermal image velocimetry to environmental conditions, air and water temperatures were recorded using a pair of Onset HOBO U20 pressure transducer data loggers set to record pressure and temperature. Deploying one data logger in the air and one in the water also provided information on variations in water level during the test flights. The resulting temperature and water level time series are provided in the file HoboDataSummary.csv with a one-minute sampling interval. These data sets were used to develop and test a new framework for mapping flow velocities in river channels in approximately real time using images from an UAS as they are acquired. Prototype code for implementing this approach was developed in MATLAB and is also included in the data release as a zip folder called VelocityMappingCode.zip. Further information on the individual functions (*.m files) included within this folder is available in the metadata file associated with this data release. The code is provided as is and is intended for research purposes only. Users are advised to thoroughly read the metadata file associated with this data release to understand the appropriate use and limitations of the data and code provided herein.
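To illustrate the cross-correlation step at the core of PIV (this is not the code in VelocityMappingCode.zip; the frame file names and frame interval are assumptions):

```matlab
% Illustrative sketch of the cross-correlation at the heart of PIV, not
% the actual code in VelocityMappingCode.zip. Assumes two consecutive
% grayscale thermal frames exported as images (file names hypothetical)
% and the Image Processing Toolbox (normxcorr2).
frameA = double(imread('thermal_t0.png'));
frameB = double(imread('thermal_t1.png'));

win = frameA(201:264, 201:264);          % 64x64 interrogation window
c = normxcorr2(win, frameB);             % normalized cross-correlation

[~, imax] = max(c(:));
[peakY, peakX] = ind2sub(size(c), imax);
dy = peakY - 264;  dx = peakX - 264;     % window displacement in pixels

pixSize = 0.38;   % m/pixel at ~270 m altitude (from the description)
dt = 1.0;         % assumed frame interval in seconds
fprintf('Surface velocity ~ (%.2f, %.2f) m/s\n', dx*pixSize/dt, dy*pixSize/dt);
```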
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
Dataset and codes for "Observation of Acceleration and Deceleration Periods at Pine Island Ice Shelf from 1997–2023"
The MATLAB codes and related datasets are used for generating the figures in the paper "Observation of Acceleration and Deceleration Periods at Pine Island Ice Shelf from 1997–2023".
Files and variables
File 1: Data_and_Code.zip
Directory: Main_function
Description: Includes MATLAB scripts and functions. Each script contains a description that guides the user on how to use it and where to find the dataset used for processing.
MATLAB main scripts: cover the full workflow to process the data, output figures, and output videos.
Script_1_Ice_velocity_process_flow.m
Script_2_strain_rate_process_flow.m
Script_3_DROT_grounding_line_extraction.m
Script_4_Read_ICESat2_h5_files.m
Script_5_Extraction_results.m
MATLAB functions: folders of MATLAB functions that support the main scripts:
1_Ice_velocity_code: MATLAB functions for ice velocity post-processing, including outlier removal, filtering, correction for atmospheric and tidal effects, inverse-weighted averaging, and error estimation.
2_strain_rate: MATLAB functions for the strain rate calculation.
3_DROT_extract_grounding_line_code: MATLAB functions for converting the range offset results output from GAMMA to differential vertical displacement and using the result to extract the grounding line.
4_Extract_data_from_2D_result: MATLAB functions used to extract profiles from 2D data.
5_NeRD_Damage_detection: modified code from Izeboud et al. 2023. When applying this code, please also cite Izeboud et al. 2023 (https://www.sciencedirect.com/science/article/pii/S0034425722004655).
6_Figure_plotting_code: MATLAB functions for the figures in the paper and supporting information.
Directory: data_and_result
Description: Includes directories that store the results output from MATLAB. Users only need to modify the paths in the MATLAB scripts to their own paths.
1_origin: sample data ("PS-20180323-20180329", “PS-20180329-20180404”, “PS-20180404-20180410”) output from the GAMMA software in GeoTIFF format that can be used to calculate DROT and velocity. Includes displacement, theta, phi, and ccp.
2_maskccpN: outliers removed where ccp < 0.05; displacement converted to velocity (m/day).
3_rockpoint: velocities extracted over non-moving regions.
4_constant_detrend: orbit error removed.
5_Tidal_correction: atmospheric and tidally induced errors removed.
6_rockpoint: non-aggregated velocities extracted over non-moving regions.
6_vx_vy_v: velocities transformed from va/vr to vx/vy.
7_rockpoint: aggregated velocities extracted over non-moving regions.
7_vx_vy_v_aggregate_and_error_estimate: inverse-weighted average of the three ice velocity maps and calculation of the error maps (see the sketch after this list).
8_strain_rate: strain rates calculated from the aggregated ice velocity.
9_compare: stores the results before and after tidal correction and aggregation.
10_Block_result: time series results extracted from the 2D data.
11_MALAB_output_png_result: stores .png files and time series results.
12_DROT: Differential Range Offset Tracking results.
13_ICESat_2: ICESat-2 .h5 files and .mat files can be put here (this folder only includes the samples from tracks 0965 and 1094).
14_MODIS_images: MODIS images can be stored here.
shp: grounding line, rock region, ice front, and other shape files.
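As a sketch of the inverse-variance weighted aggregation described for step 7 in the list above, with synthetic stand-ins for the three velocity and error maps (names and the exact weighting rule are illustrative, not taken from the provided scripts):

```matlab
% Minimal sketch of inverse-variance weighted averaging of velocity maps;
% synthetic stand-ins replace the three co-registered maps from steps 1-6.
v1 = rand(50); v2 = rand(50); v3 = rand(50);                 % velocity (m/day)
e1 = 0.10*ones(50); e2 = 0.20*ones(50); e3 = 0.15*ones(50);  % error maps

v = cat(3, v1, v2, v3);               % stack the three velocity maps
w = 1 ./ cat(3, e1, e2, e3).^2;       % inverse-variance weights

vAgg = sum(w .* v, 3) ./ sum(w, 3);   % aggregated velocity map
eAgg = sqrt(1 ./ sum(w, 3));          % error of the weighted mean
```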
File 2: PIG_front_1947_2023.zip
Includes ice front position shape files from 1947 to 2023, which are used for plotting Figure 1 in the paper.
File 3: PIG_DROT_GL_2016_2021.zip
Includes grounding line position shape files from 2016 to 2021, which are used for plotting Figure 1 in the paper.
Data was derived from the following sources:
The links can be found in the MATLAB scripts or in the paper's "Open Research" section.
License: CC0 1.0 Universal Public Domain Dedication, https://creativecommons.org/publicdomain/zero/1.0/
The files described below replicate the results of "Big G". They are divided into three parts, which can be found in three different sub-folders: (1) FiveFacts, (2) ModelSimulation, and (3) VAR.

PART 1: Five Facts on Government Spending
Folder: FiveFacts
This folder contains code to replicate Figures 1-4 and Tables 1-4 in Section 3 of the paper.

Data Set-Up
In order to run the included script files, the main dataset needs to be assembled. The data on federal procurement contracts used in this paper is all publicly available from USASpending.gov. The base dataset used for all of the empirical results in this paper consists of the universe of procurement contract transactions from 2001-2019, around 30 GB of data. Due to its size, the data requires a substantial amount of computing power to work with. Our approach was to load the data into a SQL database on a server, following the instructions provided by USASpending.gov, which can be found here: https://files.usaspending.gov/database_download/usaspending-db-setup.pdf. As a result, the replication code cannot feasibly start with the raw dataset, though we have provided the raw files at an annual basis at [INSERT URL FOR SITE HERE]. The files "setup_data_1.R", "setup_data_2.R", "setup_data_3.R", and "setup_data_4.R" pull from the SQL database and create intermediate files that are provided with this replication package. You will NOT be able to run the "set_up" files without setting up your own SQL database, but you CAN run the Figure and Table replication code (described below) using the intermediate files created in the setup files.

Figures
Figure 1
+ Step 1: Run 'create_contract_proxy.R,' which creates a dataset called 'contracts_for_ramey_merge.dta'.
+ Step 2: Run ramey_zubairy_replication.do, which is a file TAKEN DIRECTLY FROM THE REPLICATION PACKAGE for Ramey & Zubairy (JPE, 2018), found at the link below. We merge our dataset into theirs, and re-run their regressions on our data. Ramey & Zubairy (2018) replication: https://econweb.ucsd.edu/~vramey/research/Ramey_Zubairy_replication_codes.zip
Figure 2
+ 'Figure_2a.R' produces Figure 2a using 'intermediate_file_1.RData'
+ 'Figure_2b.R' produces Figure 2b using 'intermediate_file_2.RData'
Figure 3
+ 'Figure_3a.R' produces Figure 3a using 'intermediate_file_3.RData'
+ 'Figure_3b.R' produces Figure 3b using 'intermediate_file_2.RData'
Figure 4
+ 'Figure_4.R' produces Figures 4a and 4b using 'intermediate_file_3.RData'

Tables
Table 1
+ 'Table_1.do' produces Table 1 using 'contracts_for_ramey_merge.dta'
Table 2
+ 'Table_2_upper' produces the top portion of Table 2 using the 'sectors_unbalanced.dta' file created in 'setup_data_4.R'
+ 'Table_2_lower' produces the lower portion of Table 2 using the 'firms_unbalanced.dta' file created in 'setup_data_4.R'
Table 3
+ 'Table_3.R' produces Table 3 using 'intermediate_file_1.RData'
Table 4
+ Components for Table 4 can be found in 'Figure_3a.R' and 'Figure_3b.R' (noted in those files).
PART 2: Model Simulation
Folder: ModelSimulation
+ The MATLAB file MAIN_generateIRFs.m generates Figures 5 and 6 in the paper. It calls the mod file modelG.mod.
+ The MATLAB file MAIN_generateIRFs_htm.m generates Figure A.21 in the Appendix. It calls the mod file modelG_htm.mod.
+ Both files run on Dynare 5.4.

PART 3: VAR
Folder: VAR (see the README in the VAR folder for more detail).
Data Setup: "setup_var_data.R," like the setup files in the FiveFacts folder, will not run; it creates a dataset of contracts by month and naics2 sector from the SQL database.
+ 'VAR.do' runs the VAR that produces Figure 7.
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
This dataset contains in-air hand-written numbers and shapes data used in the paper:B. Alwaely and C. Abhayaratne, "Graph Spectral Domain Feature Learning With Application to in-Air Hand-Drawn Number and Shape Recognition," in IEEE Access, vol. 7, pp. 159661-159673, 2019, doi: 10.1109/ACCESS.2019.2950643.The dataset contains the following:-Readme.txt- InAirNumberShapeDataset.zip containing-Number Folder (With 2 sub folders for Matlab and Excel)-Shapes Folder (With 2 sub folders for Matlab and Excel)The datasets include the in-air drawn number and shape hand movement path captured by a Kinect sensor. The number sub dataset includes 500 instances per each number 0 to 9, resulting in a total of 5000 number data instances. Similarly, the shape sub dataset also includes 500 instances per each shape for 10 different arbitrary 2D shapes, resulting in a total of 5000 shape instances. The dataset provides X, Y, Z coordinates of the hand movement path data in Matlab (M-file) and Excel formats and their corresponding labels.This dataset creation has received The University of Sheffield ethics approval under application #023005 granted on 19/10/2018.
License: CC0 1.0 Universal Public Domain Dedication, https://creativecommons.org/publicdomain/zero/1.0/
MATLAB code containing data aggregated from DRIAD output files located in "Replication Data for: DOI 10.1063/5.0075261," external datasets used for input into the present simulation, and relevant color schemes used to recreate plots in the published paper.
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
Dataset in MATLAB format (CSV and Python format versions are available here):
This dataset provides a high-resolution, well-annotated collection of vibration measurements from cylindrical roller bearings, both healthy and with artificially induced inner ring damage. It is designed to support machine learning research addressing domain shift by enabling robust evaluation of model generalization across realistic variations in rotational speed, applied load, and mounting position.
Unlike existing bearing datasets, this resource follows a structured experimental design with controlled covariates known to cause domain shifts. It includes 1,151 multi-axis recordings (20 kHz, 60 s) across multiple bearing instances, damage states, and operating conditions.
Optimized for Leave-One-Group-Out Cross-Validation (LOGOCV), the dataset facilitates rigorous assessment of model robustness to unseen conditions (a minimal LOGOCV sketch follows below). It also includes:
Detailed metadata on testbed setup, damage geometry, and environmental parameters
Transparent labeling of assembly deviations for anomaly detection research
MATLAB scripts for streamlined data loading and segmentation
This dataset is particularly suited for work in robust ML, domain generalization, fault diagnosis, and industrial condition monitoring.
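For example, here is a minimal MATLAB sketch of the LOGOCV protocol the dataset is organized around, using synthetic stand-ins for features and group labels; the real groups would come from the recorded speed, load, and mounting-position metadata.

```matlab
% Minimal sketch of Leave-One-Group-Out Cross-Validation with synthetic
% stand-ins; requires the Statistics and Machine Learning Toolbox (fitcsvm).
rng(0);
n      = 60;
X      = randn(n, 8);                          % stand-in feature matrix
y      = categorical(randi([0 1], n, 1));      % healthy vs. damaged
groups = randi(5, n, 1);                       % e.g. 5 mounting positions

acc = zeros(max(groups), 1);
for g = 1:max(groups)
    test   = (groups == g);                    % hold out one whole group
    mdl    = fitcsvm(X(~test, :), y(~test));   % train on the other groups
    yhat   = predict(mdl, X(test, :));
    acc(g) = mean(yhat == y(test));            % accuracy on the unseen group
end
disp(acc')                                     % LOGOCV accuracy per group
```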
A detailed description of the data can be found in the Data Descriptor.
This research was performed in the context of the project VProSaar (“Verteilte Produktion für die saarländische Automotivindustrie: Nachhaltig, Vernetzt, Resilient”; “Distributed production for the Saarland automotive industry: sustainable, networked, resilient”), carried out at the Centre for Mechatronics and Automation Technology gGmbH and funded by the Ministry of Economic Affairs, Innovation, Digital and Energy (MWIDE) and the European Regional Development Fund (EFRE).