MIT License: https://opensource.org/licenses/MIT
License information was derived automatically
Importance Matrix Calibration Datasets
This repository contains calibration datasets used to generate importance matrices (imatrix), which in turn help minimise errors introduced during quantization.
Math calibration datasets
This dataset consists of over 10M tokens of cleaned math prompts and is available in six sizes, ranging from huge (~430,000 lines, approx. 10M tokens) to micro (~13,700 lines, approx. 1.7M tokens). Original data sourced from… See the full description on the dataset page: https://huggingface.co/datasets/eaddario/imatrix-calibration.
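As a rough sketch of how these files are typically consumed, the snippet below pulls one calibration text file from the Hub and feeds it to llama.cpp's llama-imatrix tool. The file name is a placeholder (check the dataset page for the actual names and sizes), and the llama-imatrix invocation assumes a recent llama.cpp build on the PATH.

```python
# Minimal sketch: fetch one calibration file and run llama.cpp's imatrix tool on it.
import subprocess
from huggingface_hub import hf_hub_download

calib_file = hf_hub_download(
    repo_id="eaddario/imatrix-calibration",
    filename="calibration_math_micro.txt",  # hypothetical name; pick a real file from the repo
    repo_type="dataset",
)

# llama-imatrix ships with llama.cpp; -m is the GGUF model, -f the calibration text,
# -o the resulting importance matrix consumed later during quantization.
subprocess.run(
    ["llama-imatrix", "-m", "model-f16.gguf", "-f", calib_file, "-o", "imatrix.dat"],
    check=True,
)
```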
Nanocalorimeter calibration data. The file format is Origin Pro* project files, which include multiple data worksheets and derived graphs. *Any mention of commercial products is for information only; it does not imply recommendation or endorsement by NIST. Please cite the related paper "Practical Guide to the Design, Fabrication and Calibration of NIST Nanocalorimeters" by Feng Yi, Michael D. Grapes, and David A. LaVan in the Journal of Research of the National Institute of Standards and Technology, Volume 124 (in press).
MIT License: https://opensource.org/licenses/MIT
License information was derived automatically
Post-hoc Calibration Dataset
This repository contains datasets designed for evaluating and developing post-hoc calibration methods for deep neural network classifiers. Each dataset includes precomputed logits and labels, divided into clear training and test splits.
Dataset Overview
Datasets provided here cover popular benchmark tasks, including CIFAR-10, CIFAR-100, SVHN, Stanford Cars (CARS), CUB-200 Birds (BIRDS), and ImageNet. Dataset composition here for post-hoc… See the full description on the dataset page: https://huggingface.co/datasets/WJHuang/calibration.
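Because each dataset ships precomputed logits and labels with explicit train/test splits, a standard post-hoc method such as temperature scaling can be fitted directly on the training split. The sketch below assumes the splits are stored as NumPy arrays; the actual file layout of this repository may differ.

```python
# Temperature scaling on precomputed logits (generic sketch; adjust the loading step).
import numpy as np
from scipy.optimize import minimize_scalar

def nll(T, logits, labels):
    # Negative log-likelihood of softmax(logits / T) at the true labels.
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

train_logits = np.load("train_logits.npy")   # assumed file names
train_labels = np.load("train_labels.npy")

res = minimize_scalar(nll, bounds=(0.05, 10.0), args=(train_logits, train_labels),
                      method="bounded")
T = res.x
print(f"Fitted temperature: {T:.3f}")
# Calibrated test probabilities are then softmax(test_logits / T).
```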
The MMR Calibration Data Set contains radiance data collected in the summer of 1987 and in July and August of 1989 via a Modular Multiband Radiometer (MMR) instrument. The MMR instrument monitored a nearly lambertian calibration panel stationed near the center of the FIFE study area. The radiances recorded from this instrument can be used to monitor solar insolation and clouds. In some cases, these data were also used to calculate the reflectance factor for reflective radiances measured over vegetation using other MMR instruments located at other FIFE sites or mounted on a helicopter.
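For reference, the usual panel-referenced computation is sketched below. The panel reflectance value is illustrative only; the actual panel characterization is documented with the FIFE data.

```python
# Sketch of the panel-referenced reflectance factor: target radiance is ratioed against
# the coincident calibration-panel radiance and scaled by the panel's known reflectance.
def reflectance_factor(target_radiance, panel_radiance, panel_reflectance=0.98):
    return (target_radiance / panel_radiance) * panel_reflectance

# e.g. a vegetation radiance of 42.0 and a panel radiance of 180.0 (same band, same units)
print(reflectance_factor(42.0, 180.0))  # ~0.229
```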
rbiswasfc/eedi-awq-calibration dataset hosted on Hugging Face and contributed by the HF Datasets community
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
This is an example dataset recorded using version 1.0 of the open-source-hardware OpenAXES IMU. Please see the GitHub repository for more information on the hardware and firmware. This dataset contains calibration sequences of four OpenAXES IMUs, which were performed using the icosahedral calibration fixture described in the OpenAXES repository and paper. For each IMU, five logs were recorded using the icosahedral calibration fixture according to the procedure described below. These can be found in the logs subdirectory. Additionally, the cuboid_logs directory contains two logs for each IMU in which the device was only placed on the flat faces of the 3D-printed case. As stated in the preprint, these are not useful for calibration and yield no usable calibration results; they are only provided for completeness. The openaxes_example_calibration_dataset subdirectory also contains the MATLAB scripts which were used to evaluate the dataset for our preprint. The script coefficients-from-mat-files.py can be used to extract the calibration coefficients from the .mat files.
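The repository's coefficients-from-mat-files.py is the authoritative extraction tool; the sketch below merely illustrates inspecting one of the .mat files with SciPy. The file name is hypothetical, and the variable names inside the files are not documented here.

```python
# Sketch: list the arrays stored in a calibration .mat file produced by the MATLAB scripts.
from scipy.io import loadmat
import numpy as np

mat = loadmat("imu1_calibration.mat")  # hypothetical file name
for name, value in mat.items():
    if name.startswith("__"):          # skip MATLAB header fields
        continue
    if isinstance(value, np.ndarray):
        print(f"{name}: shape {value.shape}")
```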
These datasets were generated for calibrating robot-camera systems. In an extension, we also considered the problem of calibrating robots with more than one camera. These datasets are provided as a companion to the paper "Solving the Robot-World Hand-Eye(s) Calibration Problem with Iterative Methods" by Amy Tabb and Khalil M. Ahmad Yousef. Included are eight datasets in zipped files, numbered DS1.zip, DS2.zip, etc. Explanations of the format of the datasets are provided in the README resource in the file "README_input_format.txt". Generally, each zipped folder consists of images and a text file of robot positions recorded when those images were acquired. Open source code can be found at https://github.com/amy-tabb/RWHEC-Tabb-AhmadYousef. We also include the results of running our code on one of the datasets so that you can verify that the code worked correctly. This folder is named DS1_write.zip and can be found in the resource titled "Output from running methods on Dataset 1". Problems/comments/bugs should be addressed to amy.tabb@ars.usda.gov.
Resources in this dataset:
- README. File Name: README_input_format.txt.txt. Description: This file gives an in-depth description of the image and robot position datasets.
- Dataset 1. File Name: DS1.zip
- Dataset 2. File Name: DS2.zip
- Dataset 3. File Name: DS3.zip
- Dataset 4. File Name: DS4.zip
- Dataset 5. File Name: DS5.zip
- Dataset 6. File Name: DS6.zip
- Dataset 7. File Name: DS7.zip
- Dataset 8. File Name: DS8.zip
- Output from running methods on Dataset 1. File Name: DS1_write.zip
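The paper's iterative solvers are implemented in the linked repository. As a quick sanity check of the AX = ZB problem formulation, the sketch below instead uses OpenCV's built-in robot-world hand-eye solvers (Shah/Li) on synthetic poses; with the real datasets, the world-to-camera poses would come from the calibration images and the base-to-gripper poses from the robot position text files.

```python
# Robot-world hand-eye calibration with OpenCV's built-in solvers, on synthetic poses.
import cv2
import numpy as np

rng = np.random.default_rng(0)

def random_pose():
    R, _ = cv2.Rodrigues(rng.normal(size=(3, 1)))
    t = rng.normal(size=(3, 1))
    return R, t

def invert(R, t):
    return R.T, -R.T @ t

# Ground-truth unknowns: X = base->world, Z = gripper->camera.
Rx, tx = random_pose()
Rz, tz = random_pose()

R_w2c, t_w2c, R_b2g, t_b2g = [], [], [], []
for _ in range(10):
    Rb, tb = random_pose()                 # base->gripper at this robot station
    Rxi, txi = invert(Rx, tx)
    # world->camera = Z * (base->gripper) * inv(X), i.e. A = Z B X^-1 so that A X = Z B.
    Ra = Rz @ Rb @ Rxi
    ta = Rz @ (Rb @ txi + tb) + tz
    R_w2c.append(Ra); t_w2c.append(ta)
    R_b2g.append(Rb); t_b2g.append(tb)

R_base2world, t_base2world, R_gripper2cam, t_gripper2cam = cv2.calibrateRobotWorldHandEye(
    R_w2c, t_w2c, R_b2g, t_b2g, method=cv2.CALIB_ROBOT_WORLD_HAND_EYE_SHAH)

print(np.allclose(R_base2world, Rx, atol=1e-5),
      np.allclose(R_gripper2cam, Rz, atol=1e-5))
```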
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset contains calibration images captured by the Biomass Monitoring System developed for AquaVitae. The images were captured in the Autonomous Systems Testing Arena at DTU Lyngby Campus and at the North Sea Oceanarium. The dataset consists of synchronized images captured from two grayscale cameras and one color camera. The calibration results were produced with the Kalibr Visual-Inertial Calibration Toolbox.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
## Overview
Calibration Target is a dataset for object detection tasks - it contains Target annotations for 200 images.
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
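A typical download flow with the Roboflow Python package is sketched below; the API key, workspace and project slugs, version number, and export format are placeholders, so copy the exact snippet from the dataset's download tab on Roboflow.

```python
# Sketch of the usual Roboflow download flow (placeholder identifiers throughout).
from roboflow import Roboflow

rf = Roboflow(api_key="YOUR_API_KEY")
project = rf.workspace("your-workspace").project("calibration-target")
dataset = project.version(1).download("coco")   # downloads images + annotations locally
print(dataset.location)
```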
## License
This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This dataset (v1) is designed for the development and evaluation of camera calibration algorithms, with a special focus on generic calibration. It contains real-world images captured by four cameras with progressively increasing lens distortion:
1. Sony A6400
2. GoPro in Linear mode
3. GoPro in Wide mode
4. Insta360
The dataset is organized into two distinct scenarios for each camera, corresponding to the following folder names:
1. fixed_camera_multi_pattern_poses: The camera remains stationary while a calibration pattern is captured at various poses.
2. fixed_pattern_multi_view: The calibration pattern is fixed, and the camera is moved to capture it from multiple views.
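For the first scenario, a minimal pinhole calibration with OpenCV could look like the sketch below. The pattern type, pattern size, and folder layout are assumptions, and the GoPro Wide and Insta360 images would call for a fisheye or generic model (e.g. cv2.fisheye) rather than the standard pinhole-plus-distortion model.

```python
# Minimal pinhole calibration sketch for the fixed_camera_multi_pattern_poses scenario,
# assuming a 9x6 chessboard with 25 mm squares (actual pattern not stated here).
import glob
import cv2
import numpy as np

pattern = (9, 6)                                   # inner corners (assumed)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * 0.025  # metres

obj_points, img_points, size = [], [], None
for path in glob.glob("fixed_camera_multi_pattern_poses/sony_a6400/*.jpg"):  # assumed layout
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)
        size = gray.shape[::-1]

assert img_points, "no chessboard detections found"
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, size, None, None)
print("RMS reprojection error:", rms)
print("Camera matrix:\n", K)
```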
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
MINTS Light Sensor Calibration Dataset,
Hanson Center for Space Sciences, University of Texas at Dallas, Richardson, TX 75080, USA
This dataset is used to calibrate the rating transition model. It contains 1,000 trajectories of the rating process X_t.
Two digital video cameras were temporarily installed at the U.S. Fish and Wildlife Service (USFWS) Pea Island National Wildlife Refuge (PINWR) in North Carolina (NC), as part of the DUring Nearshore Event eXperiment (DUNEX). DUNEX was a collaborative community-led experiment that took place in the fall of 2021 along the Outer Banks of NC, with the goal of improving the understanding, observational techniques, and predictive capabilities for extreme storm processes and impacts within the coastal environment. At the USFWS PINWR site, cameras were deployed for about a month, from September 18 to October 24, 2021, during which several storms passed offshore of the site. The cameras were mounted on separate 7-meter (m) tall masts within the dune, facing northeast and offshore, in a stereo configuration with approximately 75% overlap in field of view, to measure shoreline water levels and coincident topographic beach profiles. Images were collected during daylight hours with two schemes: 1) both cameras recording at 1 Hertz (Hz) for 5 minutes (min) starting 10 min before the hour for stereo photogrammetric processing to measure topographic beach profiles, and 2) one camera recording at 2 Hz for 17 min starting at the top of the hour for producing snapshots and time-averaged image products used to measure wave runup. This metadata record is for camera 2 and includes the necessary intrinsic orientation (IO) and extrinsic orientation (EO) calibration data to utilize the imagery to make quantitative measurements. The cameras are part of a U.S. Geological Survey (USGS) research project to study the beach and nearshore environment. USGS researchers utilize the imagery collected from these cameras to remotely sense a range of information including shoreline position, sandbar migration, wave runup on the beach, alongshore currents, and nearshore bathymetry. This camera is part of the USGS CoastCam network. To learn more about the DUNEX camera deployment visit https://www.usgs.gov/centers/whcmsc/science/dunex-pea-island-experiment.
Calibration data: CE fluxes.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
## Overview
Soccer Camera Calibration is a dataset for object detection tasks - it contains Key Points annotations for 317 images.
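One common way to use such keypoints for camera calibration is to estimate a homography from detected pitch keypoints to a metric pitch template; since the field is (nearly) planar, this already allows image-to-pitch measurements. The correspondences in the sketch below are illustrative only.

```python
# Sketch: homography from detected pitch keypoints to a metric pitch template.
import cv2
import numpy as np

# Detected keypoints in image pixels (e.g. penalty-box corners) and the corresponding
# pitch coordinates in metres (origin at one corner of the field) -- illustrative values.
img_pts = np.array([[312, 540], [968, 528], [1120, 702], [190, 715]], dtype=np.float32)
pitch_pts = np.array([[16.5, 13.84], [16.5, 54.16], [0.0, 54.16], [0.0, 13.84]],
                     dtype=np.float32)

H, mask = cv2.findHomography(img_pts, pitch_pts, cv2.RANSAC)

# Map any image point to pitch coordinates.
ball_img = np.array([[[640.0, 600.0]]], dtype=np.float32)
ball_pitch = cv2.perspectiveTransform(ball_img, H)
print(ball_pitch)
```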
## Getting Started
You can download this dataset for use within your own projects, or fork it into a workspace on Roboflow to create your own model.
## License
This dataset is available under the [CC BY 4.0 license](https://creativecommons.org/licenses/by/4.0/).
The data in this repository were collected from the San Diego, California testbed, namely, I-15 from the interchange with SR-78 in the north to the interchange with SR-163 in the south, along the mainline and at the entrance ramps. This file contains the field observations and simulation results for speed profiles from the Dallas, Texas testbed. The speed profiles are reported from 2:00 PM to 8:00 PM in 10-minute increments.
Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
License information was derived automatically
dataset.txt: https://gist.github.com/bartowski1182/eb213dccb3571f863da82e99418f81e8 + skt/kobest_v1 (boolq train)
dataset2.txt: https://gist.github.com/tristandruyen/9e207a95c7d75ddf37525d353e00659c + skt/kobest_v1 (boolq train)
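A combined calibration file of this kind could be assembled roughly as sketched below. The gist raw URL pattern and the boolq field names ("paragraph", "question") are assumptions, so verify both against the sources before use.

```python
# Sketch: concatenate the gist's raw text with the boolq train split of skt/kobest_v1.
import requests
from datasets import load_dataset

gist_raw = "https://gist.githubusercontent.com/bartowski1182/eb213dccb3571f863da82e99418f81e8/raw"
text = requests.get(gist_raw, timeout=30).text

boolq = load_dataset("skt/kobest_v1", "boolq", split="train")
korean_lines = [f"{row['paragraph']} {row['question']}" for row in boolq]  # assumed field names

with open("dataset.txt", "w", encoding="utf-8") as f:
    f.write(text.rstrip("\n") + "\n" + "\n".join(korean_lines) + "\n")
```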
Attribution-NonCommercial 4.0 (CC BY-NC 4.0): https://creativecommons.org/licenses/by-nc/4.0/
License information was derived automatically
This data set contains line scan data acquired from a Micro-Epsilon laser scanner, under default settings, mounted on an industrial robot arm. The robot is a Fanuc LR Mate 200iC industrial robot arm driven by an R-30iA Mate controller. The scanner is a Micro-Epsilon 3D profile sensor (sensor model: scanCONTROL 2900-50). To acquire the data, the laser scanner is positioned relative to a flat target plate, and laser scans are collected by using the robot to position the scanner at a range of poses relative to the plate. The data set also includes a definition of the pose of the robot tool center point. The data can be used to investigate strategies for robot hand-eye calibration of laser scanners using only data collected from flat target surfaces.
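A least-squares plane fit is the basic building block of such flat-target hand-eye strategies: each scan line, once mapped through the robot pose and the unknown hand-eye transform, must land on the same target plane. A minimal sketch on synthetic points:

```python
# Least-squares plane fit via SVD, shown on synthetic noisy line-scan points.
import numpy as np

rng = np.random.default_rng(1)
# Synthetic points roughly on the plane z = 0.1x - 0.05y + 2 (scanner-frame millimetres).
xy = rng.uniform(-25, 25, size=(500, 2))
z = 0.1 * xy[:, 0] - 0.05 * xy[:, 1] + 2.0 + rng.normal(scale=0.02, size=500)
points = np.column_stack([xy, z])

centroid = points.mean(axis=0)
_, _, vt = np.linalg.svd(points - centroid)
normal = vt[-1]                      # right singular vector of the smallest singular value
residuals = (points - centroid) @ normal
print("plane normal:", normal, " rms residual:", np.sqrt((residuals**2).mean()))
```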
Attribution-NonCommercial 3.0 (CC BY-NC 3.0): https://creativecommons.org/licenses/by-nc/3.0/
License information was derived automatically
The dataset is a collection of images, 2D image points, and 3D points. It serves to calibrate a multi-camera robot system for high-precision measurements of large industrial components (larger than 1 m x 1 m). In particular, the system uses a 3D scanner mounted on a robot manipulator and multiple cameras attached to an external frame outside of the robot work zone. To achieve a measurement resolution at the level of tens of micrometers, the 3D scanner was designed with a small working area of 100 mm x 100 mm. Consequently, to measure objects larger than the scanner's working space, a method for locating the moving scanner in 3D space was required. This approach enabled the stitching of measurements from multiple scanner positions, thereby reconstructing the complete geometry of large industrial components. For this purpose, the 3D scanner was covered with tailored markers to facilitate precise localization of the scanner within the multi-camera system. To this end, it is necessary to calibrate the cameras internally (internal calibration determining the optical parameters of the camera and lens) and to calibrate the entire vision system externally (external calibration determining the position and rotation of the cameras relative to each other). Two calibration methods have been developed:
- Dot plate method – internal calibration
- Vision target and Laser Tracker method – external calibration

Moreover, another calibration method was developed that allowed for simultaneous external and internal calibration of the vision system:
- Single dot and CMM (Coordinate-Measuring Machine) method – internal and external calibration

Finally, to be able to use the information about the localized markers to determine the position and rotation of the 3D scanner with respect to the cameras' coordinate system, tool calibration is also necessary. Tool calibration consists of finding the position of all markers on the scanner in its internal coordinate system. A method for tool calibration was developed, which consisted of scanning a rigidly mounted sphere in space from different rotations and positions of the scanner. Having information about the position of the scanner relative to the sphere and the position of the markers, we are able to determine the scanner's motion in 3D space for stitching all measurements of the industrial object.
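The sphere-scanning step above reduces to fitting a sphere to each scan of the fixed ball; a minimal linear least-squares sketch on synthetic points:

```python
# Linear least-squares sphere fit: ||p||^2 = 2 c.p + d, with d = r^2 - ||c||^2.
import numpy as np

rng = np.random.default_rng(2)
true_c, true_r = np.array([10.0, -5.0, 80.0]), 12.5
dirs = rng.normal(size=(800, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = true_c + true_r * dirs + rng.normal(scale=0.02, size=(800, 3))  # noisy surface points

A = np.column_stack([2 * pts, np.ones(len(pts))])
b = (pts ** 2).sum(axis=1)
sol, *_ = np.linalg.lstsq(A, b, rcond=None)
centre, d = sol[:3], sol[3]
radius = np.sqrt(d + centre @ centre)
print("centre:", centre, " radius:", radius)
```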
Acknowledgments: This dataset was developed as part of the project titled "Development of an optical technology for spatial positioning and navigation in measurement applications for industry based on the positioning of passive markers in visible light" (grant agreement no. POIR.01.01.01-00-D004/16-00) funded by the National Centre for Research and Development, Poland.