53 datasets found
  1. Keypoint decription for planetary environments

    • kaggle.com
    zip
    Updated Sep 23, 2023
    Cite
    George Petrakis (2023). Keypoint decription for planetary environments [Dataset]. https://www.kaggle.com/datasets/georgepetrakis/unstructured-and-planetary-environments
    Available download formats: zip (1577493688 bytes)
    Dataset updated
    Sep 23, 2023
    Authors
    George Petrakis
    License

    https://creativecommons.org/publicdomain/zero/1.0/

    Description

    This training dataset targets deep neural networks for keypoint detection and description, such as SuperPoint (DeTone et al. 2018). It was designed for unstructured and planetary scenes, aiming to train learning-based keypoint detectors and descriptors. The dataset contains about 48,000 images from Earth, Mars, and the Moon; all images have been converted to grayscale at a size of 320x240.

    The original images from Earth were captured in quarries and at construction sites in Greece, while the images from Mars were collected from a publicly available NASA dataset. The Moon images are artificial rover-based images generated and released under a CC (Creative Commons) license by Keio University in Japan.

    The dataset will be further enriched soon.

    References: DeTone D, Malisiewicz T, Rabinovich A, SuperPoint: Self-Supervised Interest Point Detection and Description, 2018, arXiv:1712.07629
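    As a minimal sketch of the stated preprocessing (grayscale, 320x240), here is a self-contained NumPy version using a simple luma weighting and nearest-neighbor resize; the function name and the exact resize method are illustrative, not taken from the dataset's own pipeline:

```python
import numpy as np

def to_gray_320x240(img):
    """Convert an HxWx3 RGB uint8 array to grayscale and nearest-neighbor
    resize it to 240 rows x 320 columns (the size used by the dataset)."""
    # Standard luma weights for RGB -> grayscale
    gray = img[..., :3] @ np.array([0.299, 0.587, 0.114])
    # Nearest-neighbor row/column selection down to 240x320
    rows = np.linspace(0, gray.shape[0] - 1, 240).round().astype(int)
    cols = np.linspace(0, gray.shape[1] - 1, 320).round().astype(int)
    return gray[np.ix_(rows, cols)].astype(np.uint8)
```

    In practice one would more likely use OpenCV's `cvtColor` and `resize`, but the sketch above shows the shape contract without extra dependencies.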

  2. SchoolCalendar

    • figshare.com
    • search.datacite.org
    xls
    Updated Jan 19, 2016
    Cite
    Alberto Gomez (2016). SchoolCalendar [Dataset]. http://doi.org/10.6084/m9.figshare.1164280.v5
    Available download formats: xls
    Dataset updated
    Jan 19, 2016
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Alberto Gomez
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Perpetual calendar which gives the GMT times of lunar phases, equinoxes, solstices, and some other calendrical curiosities for any year.

  3. Global and Planetary Change - if-computation

    • exaly.com
    csv, json
    Updated Nov 1, 2025
    + more versions
    Cite
    (2025). Global and Planetary Change - if-computation [Dataset]. https://exaly.com/journal/16370/global-and-planetary-change/impact-factor
    Available download formats: json, csv
    Dataset updated
    Nov 1, 2025
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Description

    This graph shows how the impact factor of Global and Planetary Change is computed. The left axis depicts the number of papers published in years X-1 and X-2, and the right axis displays their citations in year X.
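    The computation the graph describes is a simple ratio; a sketch with illustrative numbers (the actual counts come from the exaly data):

```python
def impact_factor(papers_prev_two_years, citations_this_year):
    """Two-year impact factor: citations received in year X to papers
    published in years X-1 and X-2, divided by the count of those papers."""
    return citations_this_year / papers_prev_two_years

# e.g. 120 papers in X-1, 130 in X-2, cited 800 times in year X
print(impact_factor(120 + 130, 800))  # 3.2
```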

  4. 3BPPCanCoord: a software package designed to produce the expansions of the...

    • data.mendeley.com
    Updated Nov 19, 2025
    Cite
    Ugo Locatelli (2025). 3BPPCanCoord : a software package designed to produce the expansions of the Three–Body Planetary Problem in Poincaré Canonical Coordinates [Dataset]. http://doi.org/10.17632/m4d5hd8wvn.1
    Dataset updated
    Nov 19, 2025
    Authors
    Ugo Locatelli
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This software package is designed to achieve a classical goal in Celestial Mechanics: computing the expansions of the Three-Body Planetary Problem in a set of canonical coordinates introduced by Poincaré, which are well suited for further developments of Hamiltonian perturbation theory, commonly based on a normal form approach. The codes included in the compressed file 3BPPCanCoord.zip should run properly on any modern computer equipped with a reasonable amount of RAM and with a C compiler and the Wolfram Mathematica software installed.

  5. Planetary Motion and Atmosphere Dataset

    • kaggle.com
    Updated May 28, 2025
    Cite
    Himel Sarder (2025). Planetary Motion and Atmosphere Dataset [Dataset]. https://www.kaggle.com/datasets/himelsarder/synthetic-planet-observation-data
    Available download formats: Croissant (a format for machine-learning datasets; see mlcommons.org/croissant)
    Dataset updated
    May 28, 2025
    Dataset provided by
    Kaggle (http://kaggle.com/)
    Authors
    Himel Sarder
    License

    https://creativecommons.org/publicdomain/zero/1.0/

    Description


    This dataset provides a collection of key physical and environmental attributes typically associated with orbiting celestial bodies, such as satellites or planets. Each row represents a unique observation or simulation point characterized by four main features:

    Features:

    • Orbital Speed (km/s): The velocity at which the object moves along its orbital path. Higher speeds are typically associated with closer proximity to the central body due to stronger gravitational pull.

    • Surface Temperature (°C): The average temperature measured or simulated at the surface level of the object. This can vary significantly based on distance from the central star, atmospheric conditions, and surface composition.

    • Atmospheric Density (kg/m³): Represents the density of the atmosphere, which influences thermal regulation, drag (for satellites), and the potential for habitability in the case of planetary bodies.

    • Reflectivity Index (Albedo): A normalized value indicating how much incoming light is reflected by the surface. A higher value suggests a more reflective surface (e.g., ice), while lower values suggest absorption (e.g., rock, ocean).
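    A minimal loading sketch for data with the four features above; the column names here are assumed for illustration and should be checked against the actual file on Kaggle:

```python
import csv
import io

# Inline sample standing in for the downloaded CSV; column names are guesses
# based on the feature list, not the dataset's actual headers.
sample = """orbital_speed_kms,surface_temp_c,atmo_density_kgm3,albedo
7.8,15.0,1.225,0.30
24.1,-63.0,0.020,0.25
"""

rows = [{k: float(v) for k, v in r.items()}
        for r in csv.DictReader(io.StringIO(sample))]
print(rows[1]["albedo"])  # 0.25
```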

  6. Moon Wreckers Planetary Rover Driving Status Anomaly Detection Dataset

    • search.dataone.org
    • dataverse.harvard.edu
    Updated Nov 22, 2023
    Cite
    Qiu, Dicong (2023). Moon Wreckers Planetary Rover Driving Status Anomaly Detection Dataset [Dataset]. http://doi.org/10.7910/DVN/UD54YB
    Dataset updated
    Nov 22, 2023
    Dataset provided by
    Harvard Dataverse
    Authors
    Qiu, Dicong
    Description

    The Planetary Rover Driving Status Anomaly Detection Dataset is generated and provided by the Moon Wreckers team from the Carnegie Mellon University Robotics Institute as part of the Roverside Assistance Project. The dataset includes 5 ROS bag files collected from the AK1 rover in different driving statuses and one ROS bag file collected from the AK2 rover in the stopped status. Topic messages exported to .csv files are also provided, including (1) the raw rover odometry estimated from wheel encoders, (2) the filtered rover odometry from both wheel encoders and IMUs, and (3) the odometry from VIVE trackers.

  7. Planetary and Space Science - if-computation

    • exaly.com
    csv, json
    Updated Nov 1, 2025
    + more versions
    Cite
    (2025). Planetary and Space Science - if-computation [Dataset]. https://exaly.com/journal/13360/planetary-and-space-science/impact-factor
    Available download formats: csv, json
    Dataset updated
    Nov 1, 2025
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Description

    This graph shows how the impact factor of Planetary and Space Science is computed. The left axis depicts the number of papers published in years X-1 and X-2, and the right axis displays their citations in year X.

  8. Solar System Planet Recognition with OpenCV

    • kaggle.com
    zip
    Updated Jun 14, 2025
    Cite
    Burak Kurt (2025). Solar System Planet Recognition with OpenCV [Dataset]. https://www.kaggle.com/datasets/burakkurt07/solar-system-planet-recognition-with-opencv
    Available download formats: zip (258183520 bytes)
    Dataset updated
    Jun 14, 2025
    Authors
    Burak Kurt
    License

    Apache License, v2.0: https://www.apache.org/licenses/LICENSE-2.0
    License information was derived automatically

    Description

    This project aims to develop a model that recognizes Solar System planets using OpenCV and machine learning techniques. The project provides a basic application in the fields of image processing and object recognition.


  9. Simultaneous Localization and Mapping for Planetary Surface Mobility, Phase...

    • data.nasa.gov
    application/rdfxml +5
    Updated Jun 26, 2018
    Cite
    (2018). Simultaneous Localization and Mapping for Planetary Surface Mobility, Phase I [Dataset]. https://data.nasa.gov/dataset/Simultaneous-Localization-and-Mapping-for-Planetar/9tfg-bund
    Available download formats: json, xml, csv, application/rdfxml, tsv, application/rssxml
    Dataset updated
    Jun 26, 2018
    License

    U.S. Government Works: https://www.usa.gov/government-works
    License information was derived automatically

    Description

    ProtoInnovations, LLC and Carnegie Mellon University have formed a partnership to commercially develop localization and mapping technologies for planetary rovers. Our first aim is to provide a reliable means of localization that is independent of infrastructure, such as GPS, and compatible with requirements of missions to planetary surfaces. Simultaneously solving for the precise location of the rover as it moves while building an accurate map of the environment is an optimization problem involving internal sensing, sensing of the surrounding environment, probabilistic optimization methods, efficient data structures, and a robust implementation. Our second aim is to merge simultaneous localization and mapping (SLAM) technologies with our existing Reliable Autonomous Surface Mobility (RASM) architecture for rover navigation. Our unique partnership brings together state-of-the-art technologies for SLAM with experience in delivering and supporting both autonomous systems and mobility platforms for NASA.

    Our proposed project will create a SLAM framework that is capable of accurately localizing a rover throughout long, multi-kilometer traverses of barren terrain. Our approach is compatible with limited communication and computing resources expected for missions to planetary surfaces. Our technology is based on innovative representations of evidence grids, particle-filter algorithms that operate on range data rather than explicit features, and strategies for segmenting large evidence grids into manageable pieces.
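    To make the particle-filter idea concrete, here is a toy 1D localization sketch: particles are propagated with motion noise, weighted by agreement with a noisy range reading to a landmark at a known position, and resampled. This is a generic textbook illustration, not the RASM/SLAM code described above; all names and numbers are illustrative.

```python
import math
import random

def pf_step(particles, control, measured_range, landmark=20.0, sigma=0.5):
    """One predict-weight-resample cycle of a 1D particle filter."""
    # 1) propagate each particle with the commanded motion plus noise
    moved = [p + control + random.gauss(0.0, 0.05) for p in particles]
    # 2) weight by agreement between predicted and measured range to landmark
    weights = [math.exp(-((landmark - p) - measured_range) ** 2 / (2 * sigma ** 2))
               for p in moved]
    total = sum(weights) or 1.0
    # 3) resample proportionally to the weights
    return random.choices(moved, weights=[w / total for w in weights], k=len(moved))

random.seed(0)
true_pos = 2.0
particles = [random.uniform(0.0, 20.0) for _ in range(500)]
for _ in range(25):
    true_pos += 0.2                                    # rover drives forward
    reading = (20.0 - true_pos) + random.gauss(0.0, 0.2)  # noisy range sensor
    particles = pf_step(particles, 0.2, reading)
estimate = sum(particles) / len(particles)
print(round(estimate, 2), "vs true", round(true_pos, 2))
```

    Real SLAM additionally estimates the map (here the landmark is known) and, as the description notes, operates on range data over evidence grids rather than a single beacon.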

    In this project we will evaluate the maturity of these algorithms, developed for research programs at Carnegie Mellon, and incorporate them into our RASM architecture, thus providing portable and reliable localization for a variety of vehicle platforms and sensors. Mission constraints will vary broadly, so our SLAM components will be able to merge readings from various suites of sensors that may be found on planetary rovers.

  10. Data from: iSALE-Dellen manual

    • figshare.com
    pdf
    Updated Jul 13, 2023
    Cite
    Gareth S. Collins; Dirk Elbeshausen; Thomas M. Davison; Kai Wünnemann; Boris Ivanov; H. Jay Melosh (2023). iSALE-Dellen manual [Dataset]. http://doi.org/10.6084/m9.figshare.3473690.v1
    Available download formats: pdf
    Dataset updated
    Jul 13, 2023
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Gareth S. Collins; Dirk Elbeshausen; Thomas M. Davison; Kai Wünnemann; Boris Ivanov; H. Jay Melosh
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Manual for the Dellen release of the iSALE shock physics code:

    A multi-material, multi-rheology shock physics code for simulating impact phenomena in two and three dimensions.

  11. Data from: Water saturation in texturally porous carbonate rocks: Shock...

    • data.mendeley.com
    Updated Oct 15, 2025
    Cite
    Juulia-Gabrielle Moreau (2025). Water saturation in texturally porous carbonate rocks: Shock thermodynamics and dampening of the shock. [Dataset]. http://doi.org/10.17632/85ndv677mj.1
    Dataset updated
    Oct 15, 2025
    Authors
    Juulia-Gabrielle Moreau
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    iSALE configuration files, collected data, and plotting means for the mesoscale models

  12. NASA DEM

    • kenya.lsc-hubs.org
    • rwanda.lsc-hubs.org
    Updated Feb 5, 2024
    Cite
    (2024). NASA DEM [Dataset]. https://kenya.lsc-hubs.org/cat/collections/metadata:main/items/dea_nasadem
    Dataset updated
    Feb 5, 2024
    Description

    NASADEM from Microsoft's Planetary Computer

  13. Sentinel-2 10m Land Use/Land Cover Time Series

    • digital-earth-pacificcore.hub.arcgis.com
    • caribbeangeoportal.com
    • +10 more
    Updated Oct 19, 2022
    + more versions
    Cite
    Esri (2022). Sentinel-2 10m Land Use/Land Cover Time Series [Dataset]. https://digital-earth-pacificcore.hub.arcgis.com/datasets/cfcb7609de5f478eb7666240902d4d3d
    Dataset updated
    Oct 19, 2022
    Dataset authored and provided by
    Esri (http://esri.com/)
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This layer displays a global map of land use/land cover (LULC) derived from ESA Sentinel-2 imagery at 10m resolution. Each year is generated with Impact Observatory's deep learning AI land classification model, trained using billions of human-labeled image pixels from the National Geographic Society. The global maps are produced by applying this model to the Sentinel-2 Level-2A image collection on Microsoft's Planetary Computer, processing over 400,000 Earth observations per year. The algorithm generates LULC predictions for nine classes, described in detail below. The year 2017 has a land cover class assigned for every pixel, but its class is based upon fewer images than the other years; the years 2018-2024 are based upon a more complete set of imagery, so 2017 may have less accurate land cover class assignments.

    Key Properties

    • Variable mapped: Land use/land cover in 2017, 2018, 2019, 2020, 2021, 2022, 2023, 2024
    • Source Data Coordinate System: Universal Transverse Mercator (UTM) WGS84
    • Service Coordinate System: Web Mercator Auxiliary Sphere WGS84 (EPSG:3857)
    • Extent: Global
    • Source imagery: Sentinel-2 L2A
    • Cell Size: 10 meters
    • Type: Thematic
    • Attribution: Esri, Impact Observatory
    • Analysis: Optimized for analysis

    Class Definitions

    • 1 Water: Areas where water was predominantly present throughout the year; may not cover areas with sporadic or ephemeral water; contains little to no sparse vegetation, no rock outcrop nor built-up features like docks. Examples: rivers, ponds, lakes, oceans, flooded salt plains.
    • 2 Trees: Any significant clustering of tall (~15 feet or higher) dense vegetation, typically with a closed or dense canopy. Examples: wooded vegetation, clusters of dense tall vegetation within savannas, plantations, swamp or mangroves (dense/tall vegetation with ephemeral water or canopy too thick to detect water underneath).
    • 4 Flooded vegetation: Areas of any type of vegetation with obvious intermixing of water throughout a majority of the year; seasonally flooded areas that are a mix of grass/shrub/trees/bare ground. Examples: flooded mangroves, emergent vegetation, rice paddies and other heavily irrigated and inundated agriculture.
    • 5 Crops: Human planted/plotted cereals, grasses, and crops not at tree height. Examples: corn, wheat, soy, fallow plots of structured land.
    • 7 Built Area: Human-made structures; major road and rail networks; large homogenous impervious surfaces including parking structures, office buildings and residential housing. Examples: houses, dense villages/towns/cities, paved roads, asphalt.
    • 8 Bare ground: Areas of rock or soil with very sparse to no vegetation for the entire year; large areas of sand and deserts with no to little vegetation. Examples: exposed rock or soil, desert and sand dunes, dry salt flats/pans, dried lake beds, mines.
    • 9 Snow/Ice: Large homogenous areas of permanent snow or ice, typically only in mountain areas or highest latitudes. Examples: glaciers, permanent snowpack, snow fields.
    • 10 Clouds: No land cover information due to persistent cloud cover.
    • 11 Rangeland: Open areas covered in homogenous grasses with little to no taller vegetation; wild cereals and grasses with no obvious human plotting (i.e., not a plotted field); mix of small clusters of plants or single plants dispersed on a landscape that shows exposed soil or rock; scrub-filled clearings within dense forests that are clearly not taller than trees. Examples: natural meadows and fields with sparse to no tree cover, open savanna with few to no trees, parks/golf courses/lawns, pastures, moderate to sparse cover of bushes, shrubs and tufts of grass, savannas with very sparse grasses, trees or other plants.

    NOTE: The land use focus does not provide the spatial detail of a land cover map. As such, for the built area classification, yards, parks, and groves will appear as built area rather than trees or rangeland classes.

    Usage Information and Best Practices

    Processing Templates: This layer includes a number of preconfigured processing templates (raster function templates) to provide on-the-fly data rendering and class isolation for visualization and analysis. Each processing template includes labels and descriptions to characterize the intended usage: visualization, analysis, or both.

    Visualization: The default rendering on this layer displays all classes, and a number of on-the-fly renderings/processing templates are designed specifically for data visualization. By default, the most recent year is displayed. To discover and isolate specific years for visualization in Map Viewer, try using the Image Collection Explorer.

    Analysis: In order to leverage the optimization for analysis, the capability must be enabled by your ArcGIS organization administrator; more information can be found in the 'Regional data hosting' section of the help documentation. Optimized for analysis means this layer does not have size constraints for analysis and is recommended for multisource analysis with other layers optimized for analysis (see the corresponding group for a complete list of such imagery layers). Prior to running analysis, users should always provide some form of data selection, either with a layer filter (e.g. a specific date range, cloud cover percent, mission, etc.) or by selecting specific images; the Image Collection Explorer can help discover and isolate specific images in Map Viewer. Zonal Statistics is a common tool for understanding the composition of a specified area by reporting the total estimates for each of the classes.

    General: If you are new to Sentinel-2 LULC, the Sentinel-2 Land Cover Explorer provides a good introductory user experience for working with this imagery layer; see also the Quick Start Guide. Global land use/land cover maps provide information for conservation planning, food security, and hydrologic modeling, among other things, and this dataset can be used to visualize land use/land cover anywhere on Earth.

    Classification Process: These maps include Version 003 of the global Sentinel-2 land use/land cover data product. It is produced by a deep learning model trained using over five billion hand-labeled Sentinel-2 pixels, sampled from over 20,000 sites distributed across all major biomes of the world. The underlying deep learning model uses 6 bands of Sentinel-2 L2A surface reflectance data: visible blue, green, red, near infrared, and two shortwave infrared bands. To create the final map, the model is run on multiple dates of imagery throughout the year, and the outputs are composited into a final representative map for each year. The input Sentinel-2 L2A data was accessed via Microsoft's Planetary Computer and scaled using Microsoft Azure Batch.

    Citation: Karra, Kontgis, et al. "Global land use/land cover with Sentinel-2 and deep learning." IGARSS 2021-2021 IEEE International Geoscience and Remote Sensing Symposium. IEEE, 2021.

    Acknowledgements: Training data for this project makes use of the National Geographic Society Dynamic World training dataset, produced for the Dynamic World Project by National Geographic Society in partnership with Google and the World Resources Institute.
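    When working with the raster values programmatically, the class definitions above can be transcribed into a small lookup table (values 3 and 6 do not occur in this product); the helper function name is illustrative:

```python
# Class values and names transcribed from the layer's class definitions
LULC_CLASSES = {
    1: "Water",
    2: "Trees",
    4: "Flooded vegetation",
    5: "Crops",
    7: "Built Area",
    8: "Bare ground",
    9: "Snow/Ice",
    10: "Clouds",
    11: "Rangeland",
}

def class_name(value):
    """Map a raster pixel value to its LULC class name."""
    return LULC_CLASSES.get(value, "Unknown")

print(class_name(11))  # Rangeland
```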

  14. Planetary Rover Autonomy Software Market Research Report 2033

    • dataintelo.com
    csv, pdf, pptx
    Updated Sep 30, 2025
    Cite
    Dataintelo (2025). Planetary Rover Autonomy Software Market Research Report 2033 [Dataset]. https://dataintelo.com/report/planetary-rover-autonomy-software-market
    Available download formats: csv, pptx, pdf
    Dataset updated
    Sep 30, 2025
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Planetary Rover Autonomy Software Market Outlook



    According to our latest research, the global planetary rover autonomy software market size reached USD 1.14 billion in 2024, supported by rapid advancements in autonomous navigation, artificial intelligence, and the expanding scope of planetary exploration missions. The market is expected to grow at a robust CAGR of 13.7% from 2025 to 2033, reaching a forecasted value of USD 3.49 billion by 2033. This significant growth is driven by increased investments from government space agencies, the emergence of commercial space exploration initiatives, and the rising demand for advanced software solutions that enable safe, efficient, and autonomous operations of planetary rovers in challenging extraterrestrial environments.
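    The quoted figures can be sanity-checked with the compound annual growth rate formula; the small gap between the implied and stated CAGR presumably reflects the report's base-year convention:

```python
# CAGR implied by growth from USD 1.14bn (2024) to USD 3.49bn (2033),
# i.e. compounding over 9 years: (end / start) ** (1 / years) - 1
implied_cagr = (3.49 / 1.14) ** (1 / 9) - 1
print(round(implied_cagr * 100, 1))  # 13.2 -- close to the stated 13.7%
```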




    One of the primary growth factors for the planetary rover autonomy software market is the surge in interplanetary missions by major space agencies such as NASA, ESA, and CNSA. These agencies are increasingly deploying advanced rovers for lunar and Martian exploration, which necessitates highly reliable autonomy software to ensure mission success. The complexity of extraterrestrial terrains and the communication delays inherent in deep space missions require sophisticated software capable of real-time decision-making, obstacle avoidance, and adaptive path planning. The integration of machine learning and computer vision technologies into rover autonomy systems further enhances their ability to interpret sensor data, adapt to unforeseen challenges, and maximize scientific output, thus fueling market growth.




    Another key driver is the growing participation of commercial space companies, such as SpaceX, Blue Origin, and ispace, in planetary exploration. These companies are not only collaborating with governmental agencies but are also initiating independent missions, thereby expanding the market for autonomy software. The privatization of space exploration has accelerated innovation cycles, reduced costs, and introduced new business models, including lunar resource extraction and satellite servicing. As commercial entities strive for operational efficiency and mission safety, the demand for robust, scalable, and customizable autonomy software solutions continues to rise. This trend is expected to further diversify the market landscape and encourage the development of software tailored for a wide variety of rover platforms and mission profiles.




    The evolution of autonomy levels—from supervised and semi-autonomous to fully autonomous operations—also contributes significantly to market expansion. Advances in artificial intelligence, sensor fusion, and edge computing enable rovers to operate with minimal human intervention, even in unpredictable and hazardous environments. This transition not only enhances mission success rates but also reduces operational costs and the need for continuous ground-based supervision. The growing emphasis on autonomy is particularly evident in planned missions to the Moon, Mars, and asteroids, where communication latency makes real-time remote control impractical. As a result, investments in research and development of next-generation autonomy software are expected to remain strong, propelling the market forward.




    From a regional perspective, North America currently dominates the planetary rover autonomy software market, accounting for over 45% of the global revenue in 2024. This leadership is attributed to the presence of leading space agencies, a robust ecosystem of aerospace technology companies, and significant government funding for space exploration. Europe follows closely, supported by collaborative missions and strong research infrastructure. Meanwhile, the Asia Pacific region is rapidly emerging as a key growth market, driven by increased investments from China, India, and Japan in lunar and planetary exploration initiatives. These regional dynamics are expected to shape the competitive landscape and innovation trajectory of the market in the coming years.



    Component Analysis



    The component segment of the planetary rover autonomy software market is divided into software, hardware, and services, each playing a crucial role in enabling autonomous rover operations. The software segment leads the market, accounting for the largest share in 2024, as it forms the core of rover intelligence, encompassing navigation algorithms, sensor fusion, data interpretation, and mission planning.

  15. satclip

    • huggingface.co
    Updated Dec 15, 2023
    Cite
    Daniel van Strien (2023). satclip [Dataset]. https://huggingface.co/datasets/davanstrien/satclip
    Available download formats: Croissant (a format for machine-learning datasets; see mlcommons.org/croissant)
    Dataset updated
    Dec 15, 2023
    Authors
    Daniel van Strien
    Description

    Dataset Card for S2-100K

    The S2-100K dataset is a dataset of 100,000 multi-spectral satellite images sampled from Sentinel-2 via the Microsoft Planetary Computer. Copernicus Sentinel data is captured between Jan 1, 2021 and May 17, 2023. The dataset is sampled approximately uniformly over landmass and only includes images without cloud coverage. The dataset is available for research purposes only. If you use the dataset, please cite our paper. More information on the dataset can… See the full description on the dataset page: https://huggingface.co/datasets/davanstrien/satclip.

  16. Autonomous Vehicles with High Capacity Computational Power Used in Remote...

    • data.wu.ac.at
    • data.nasa.gov
    xml
    Updated Sep 16, 2017
    Cite
    National Aeronautics and Space Administration (2017). Autonomous Vehicles with High Capacity Computational Power Used in Remote Sensing Applications [Dataset]. https://data.wu.ac.at/schema/data_gov/MWZhNzgwNGUtZDQ5Ni00NmVhLThmZmItOGUwOTk3OTI3ZWJk
    Available download formats: xml
    Dataset updated
    Sep 16, 2017
    Dataset provided by
    National Aeronautics and Space Administration
    License

    U.S. Government Works: https://www.usa.gov/government-works
    License information was derived automatically

    Description

    The Unmanned Aerial System (UAS) industry in the United States is still very much in its infancy, but its potential impacts on the geospatial mapping and surveying professions are indisputable.

    In future years, imaging and remote-sensing observations with semi-autonomous Unmanned Autonomous Vehicle (UAV) operations will be key requirements for surveys of other planetary atmospheres and surfaces. In anticipation of these requirements, it is imperative that new technologies with increased automation capability, speed, and accuracy achievable during a single mission are developed, evaluated, and implemented.

    For this project, a prototype autonomous rover system was developed and tested that provides a framework to collect planetary remotely sensed data and leverage cloud computing services to produce environmental mapping products from that data.

    This innovative technology could potentially support a wide variety of planetary data-gathering science missions while offering the flexibility to incorporate additional new techniques that could eventually be applied to swarm rovers integrating planetary aerial and surface access systems. Additionally, this technology could potentially be used to address SSC-related facility monitoring and security issues, such as buffer zone intrusions, and provide support for rapid response to both natural and manmade disasters.

    In military operations, large remotely piloted UAVs have been successfully deployed for several years. Their success has spawned a new area of research: micro autonomous aerial vehicles (micro-AAVs). Over the past two years, this research area has been exploited by universities, resulting in a rich collection of micro-AAV platforms, ranging from small, open-platform systems using open-source waypoint-navigation software to small, production-ready, commercial-off-the-shelf platforms with highly intelligent flight-management systems. These platforms can support a full array of sensors and cameras, from high-resolution, true-color still imagers to high-resolution, real-time video streams. In addition, some platforms can carry near-infrared (NIR) cameras that can be used to produce Normalized Difference Vegetation Index (NDVI) data products useful for vegetation-health monitoring, similar to those generated today by our team using Moderate Resolution Imaging Spectroradiometer (MODIS) satellite data.
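    The NDVI product mentioned above is a simple per-pixel band ratio of the near-infrared and red channels; a minimal sketch (the reflectance values below are illustrative, not from the project data):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Inputs are reflectance values in [0, 1]; output lies in [-1, 1].
    Healthy vegetation reflects strongly in NIR and absorbs red light,
    so dense canopy approaches +1 while bare soil sits near 0.
    """
    return (nir - red) / (nir + red)

canopy = ndvi(0.50, 0.08)     # strong NIR reflectance -> high NDVI
bare_soil = ndvi(0.30, 0.25)  # similar band values -> near zero
```

In a real pipeline the same ratio is applied per pixel across co-registered NIR and red band arrays rather than to scalar samples.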

    Additionally, for over a decade now, rovers have been successfully used on Mars to collect close-up surface imagery and other sensor data. For future lunar and planetary exploratory missions, the development of smaller and more efficient micro-rover platforms has been proposed, and such platforms have been prototyped in a variety of forms with a variety of locomotive means. For successful and safe exploration of these surfaces, ultra-high-resolution terrain and feature data, as well as a flexible autonomous system to gather and process this data over wide areas, will be required.

    For this project, the potential of simulating a tethered rover-balloon system, an autonomous, cloud-enabled system for gathering and processing low-altitude, high-resolution imagery for terrain-model and thematic-data-product creation, was explored and demonstrated. The tablet cameras and sensors were used as a proxy for the AAV sensor and image data. A typical limiting factor associated with the small payload of these micro-AAV systems is the computational power that can be deployed on them, which correspondingly limits their autonomous capabilities. To increase computational capacity, data was pushed to a cloud location for access by the processing system; accordingly, this project explored using cloud computing to augment the computational capacity of a tablet.

    The tablet and a commercial-off-the-shelf (COTS) smartphone with camera were able to establish communication with the cloud, with the smartphone tethering to the tablet's mobile Wi-Fi hotspot for internet access. The tablet allowed for real-time data processing, analysis, and autonomous flight operations based on those observations.

    Therefore, for this project, the effective computational power of these platforms was increased by simulating cloud computing services via a local virtual-machine data-processing system. Using this virtual machine to establish communication with the cloud augmented the computational capacity of the simulated micro-AAV and enabled real-time data processing and analysis based on those observations.
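    The offload pattern described above, where a capture device pushes each frame to a remote (here, VM-simulated) processor and consumes the analysis result, can be sketched as follows. The interface names are hypothetical, and the transport callable stands in for whatever HTTP or socket call would reach the virtual-machine endpoint, which also makes the sketch testable offline:

```python
import json

def offload_frame(image_bytes, metadata, transport):
    """Push one captured frame plus metadata to the (simulated) cloud
    processor via the supplied transport callable, then decode the JSON
    analysis result that drives the next navigation decision.

    `transport` is a stand-in for an HTTP POST to the VM endpoint; it
    receives a JSON header and the raw frame, and returns reply bytes.
    """
    header = json.dumps({"meta": metadata, "size": len(image_bytes)})
    reply = transport(header.encode(), image_bytes)
    return json.loads(reply)

# Offline stand-in for the VM service: echoes the frame size back.
def fake_vm(header, frame):
    request = json.loads(header)
    return json.dumps({"ok": True, "bytes": request["size"]}).encode()

result = offload_frame(b"\x00" * 1024, {"alt_m": 12.5}, fake_vm)
```

Because the transport is injected, the same loop runs unchanged whether the endpoint is a local virtual machine (as in this project) or an actual cloud service.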

    Future testing of this data-processing flow via a virtual machine could be translated directly to current cloud computing services with little modification and, once implemented, could enhance the ability of available UAV aerial rapid-response platforms to respond to natural or manmade disasters.

  17. Planetary Rover Autonomy Software Market Research Report 2033

    • growthmarketreports.com
    csv, pdf, pptx
    Updated Oct 6, 2025
    Cite
    Growth Market Reports (2025). Planetary Rover Autonomy Software Market Research Report 2033 [Dataset]. https://growthmarketreports.com/report/planetary-rover-autonomy-software-market
    Explore at:
    pdf, csv, pptxAvailable download formats
    Dataset updated
    Oct 6, 2025
    Dataset authored and provided by
    Growth Market Reports
    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    Planetary Rover Autonomy Software Market Outlook



    As per our latest research, the planetary rover autonomy software market size reached USD 1.42 billion in 2024, demonstrating robust expansion driven by technological advancements and increased investments in space exploration. The market is expected to grow at a CAGR of 13.7% from 2025 to 2033, projecting a value of USD 4.15 billion by 2033. This significant growth is primarily fueled by the rising demand for autonomous navigation and data processing capabilities in planetary exploration missions, as agencies and private companies prioritize efficiency, safety, and mission success in increasingly complex extraterrestrial environments.
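    The headline figures follow standard compound-growth arithmetic; a quick sketch for checking such projections (note that the exact rate depends on whether the base year itself counts as a compounding period, so forecast conventions vary between reports):

```python
def compound_growth(base, rate, years):
    """Project a value forward at a constant annual growth rate (CAGR)."""
    return base * (1 + rate) ** years

def implied_cagr(start, end, years):
    """Annual rate that grows `start` into `end` over `years` periods."""
    return (end / start) ** (1.0 / years) - 1.0

# Implied annual rate taking USD 1.42B (2024) to USD 4.15B (2033),
# treating 2025-2033 as nine compounding periods:
rate = implied_cagr(1.42, 4.15, 9)  # roughly 12-13% per year
```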




    The primary growth driver for the planetary rover autonomy software market is the accelerating pace of global space exploration initiatives. With both governmental agencies and commercial entities launching more frequent and ambitious missions to the Moon, Mars, and beyond, the need for reliable and sophisticated autonomy software has intensified. These software solutions enable rovers to make real-time decisions, avoid obstacles, and perform scientific tasks with minimal human intervention, which is crucial for missions where communication delays or hazardous terrains are prevalent. The integration of artificial intelligence, machine learning, and advanced sensor fusion technologies has further enhanced the capabilities of autonomy software, making planetary exploration safer, more efficient, and cost-effective. The proliferation of international collaborations, such as the Artemis program and Mars Sample Return missions, also contributes to the market’s momentum by fostering innovation and resource sharing.




    Another pivotal factor propelling market growth is the expanding role of commercial space companies in planetary exploration. The entry of private players like SpaceX, Blue Origin, and Astrobotic has injected significant capital and technical expertise into the sector, accelerating the development and deployment of next-generation autonomous rovers. These companies are not only supporting governmental missions but are also pursuing their own exploration agendas, including lunar mining and asteroid prospecting. As competition intensifies, there is a growing emphasis on developing modular, scalable, and interoperable autonomy software platforms that can be adapted to various mission profiles and environmental conditions. Furthermore, the increasing adoption of cloud-based and remote software deployment models allows for real-time updates, diagnostics, and mission control, further enhancing operational flexibility and reducing downtime.




    The surge in scientific research and technological innovation is also a crucial catalyst for the planetary rover autonomy software market. Research institutes and academic organizations are collaborating with space agencies to develop advanced algorithms for autonomous navigation, terrain mapping, and sample collection. These collaborations are resulting in breakthroughs such as cooperative multi-rover operations, swarm robotics, and self-healing software architectures, which are expected to become standard features in future missions. Additionally, the miniaturization of hardware and the advent of low-cost, high-performance computing platforms are enabling the deployment of sophisticated autonomy solutions on smaller, more agile rover platforms, democratizing access to planetary exploration for emerging space nations and smaller research teams.




    From a regional perspective, North America continues to dominate the planetary rover autonomy software market, accounting for the largest share in 2024, followed closely by Europe and Asia Pacific. The United States, with its well-established space infrastructure and significant investments from NASA and private companies, remains at the forefront of innovation and deployment. Europe’s market is buoyed by the European Space Agency’s (ESA) robust exploration agenda and growing participation from member states in lunar and Martian missions. Meanwhile, Asia Pacific is emerging as a significant growth hub, with China, India, and Japan making substantial strides in planetary exploration and autonomy research. These regional dynamics are fostering a competitive yet collaborative environment that is driving the market’s overall expansion.



  18. DeepLandforms: A Deep Learning Computer Vision toolset applied to a prime...

    • data.europa.eu
    unknown
    Updated Nov 22, 2022
    Cite
    Zenodo (2022). DeepLandforms: A Deep Learning Computer Vision toolset applied to a prime use case for mapping planetary skylights - ANc [Dataset]. https://data.europa.eu/data/datasets/oai-zenodo-org-5734912?locale=mt
    Explore at:
    unknown(58085309)Available download formats
    Dataset updated
    Nov 22, 2022
    Dataset authored and provided by
    Zenodo (http://zenodo.org/)
    Description

    Initial training dataset for DeepLandforms: A Deep Learning Computer Vision toolset applied to a prime use case for mapping planetary skylights

  19. Meteoritics and Planetary Science - if-computation

    • exaly.com
    csv, json
    Updated Nov 1, 2025
    Cite
    (2025). Meteoritics and Planetary Science - if-computation [Dataset]. https://exaly.com/journal/15702/meteoritics-and-planetary-science/impact-factor
    Explore at:
    json, csvAvailable download formats
    Dataset updated
    Nov 1, 2025
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0)https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Description

    This graph shows how the impact factor of Meteoritics and Planetary Science is computed. The left axis depicts the number of papers published in years X-1 and X-2, and the right axis displays their citations in year X.
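    The computation the graph depicts is the standard two-year impact factor; a minimal sketch with illustrative numbers (not values from this dataset):

```python
def impact_factor(citations_in_x, papers_x_minus_1, papers_x_minus_2):
    """Two-year impact factor for year X: citations received in year X
    to items published in years X-1 and X-2, divided by the number of
    those items."""
    return citations_in_x / (papers_x_minus_1 + papers_x_minus_2)

# e.g. 300 citations in year X to the 90 + 110 papers from X-1 and X-2:
if_example = impact_factor(300, 90, 110)  # 300 / 200 = 1.5
```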

  20. COLA Weather and Climate Data

    • search.dataone.org
    Updated Nov 17, 2014
    Cite
    IGES Center for Ocean-Land-Atmosphere Studies (COLA), Weather and Climate Data Center (2014). COLA Weather and Climate Data [Dataset]. https://search.dataone.org/view/COLA_Weather_and_Climate_Data.xml
    Explore at:
    Dataset updated
    Nov 17, 2014
    Dataset provided by
    Regional and Global Biogeochemical Dynamics Data (RGD)
    Authors
    IGES Center for Ocean-Land-Atmosphere Studies (COLA), Weather and Climate Data Center
    Time period covered
    Jan 1, 1950
    Area covered
    Earth
    Description

    The Center for Ocean-Land-Atmosphere Studies (COLA) allows earth scientists from several disciplines to work closely together on interdisciplinary research related to variability and predictability of Earth's climate on seasonal to decadal time scales. COLA scientists utilize numerical models of the Earth's global atmosphere, world oceans and land surface biosphere in numerical predictability experiments and experimental predictions, and use advanced techniques for analysis of observational and model data.

    COLA produces maps of current weather conditions, weather forecasts, short-term climate outlooks, metropolitan meteograms, and maximum potential hurricane intensity for the United States, North America, and other selected regions of the world.
