4 datasets found
  1. GCP-Cloud-Billing-Data

    • kaggle.com
    zip
    Updated Aug 30, 2024
    Cite
    SAIRAM N (2024). GCP-Cloud-Billing-Data [Dataset]. https://www.kaggle.com/datasets/sairamn19/gcp-cloud-billing-data
    Explore at:
    zip (59956 bytes)
    Dataset updated
    Aug 30, 2024
    Authors
    SAIRAM N
    License

    https://cdla.io/sharing-1-0/

    Description

    This dataset highlights the importance of effectively using Google Cloud Platform (GCP) billing data to gain actionable insights into cloud spending. It emphasizes strategic cost management, offering guidance on analyzing billing data, optimizing resource usage, and implementing best practices to minimize costs while maximizing the value derived from cloud services. It is aimed at businesses and technical teams looking to maintain financial control and improve their cloud operations.

    This dataset contains GCP cloud billing cost data. For an updated version, comment on the dataset page or contact the author.

  2. Temperature_IoT_on_GCP

    • kaggle.com
    zip
    Updated Feb 8, 2021
    Cite
    MattPo (2021). Temperature_IoT_on_GCP [Dataset]. https://www.kaggle.com/datasets/mattpo/temperature-iot-on-gcp/discussion
    Explore at:
    zip (76612897 bytes)
    Dataset updated
    Feb 8, 2021
    Authors
    MattPo
    License

    Open Database License (ODbL) v1.0: https://www.opendatacommons.org/licenses/odbl/1.0/
    License information was derived automatically

    Description

    Context

    This dataset corresponds to a three-part Medium.com article series covering a fully reproducible walkthrough of production-scale best practices for IoT data streaming on GCP. Covered topics include: IoT device setup and streaming into GCP's IoT Core and PubSub, movement of data from PubSub to BigQuery via Dataflow, visualization of that data in Data Studio, and machine learning model construction and deployment via BigQuery ML (using AutoML Tables).

    The dataset can ultimately be used to generate a machine learning model that identifies, in near real-time, whether or not a particular window in my home is open based on temperature values from three different sensors. The entire workflow, from data ingestion to continually deployed ML predictions, is achieved with a fully managed, auto-scaling, and serverless architecture.

    Content

    Present in this dataset are two files:

    1. temperature.csv. This contains 10.5M rows of raw temperature sensor data streamed from three different sensors. Sensors 258* and 270* are each positioned close to their own window, while sensor 275* is about 8 feet away from a third window. The column names are largely self-explanatory: timestamp_utc,timestamp_epoch,temp_f,temp_c,device_id
    2. window_opened_closed.csv. This defines the time frames when a particular window was opened. All time points from the raw dataset where a window was not open should be assumed to have all windows closed. The column names are largely self-explanatory: DayPST,StartTimePST,EndTimePST,ObjectCode,ObjectName
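    Given the two column layouts listed above, joining the raw readings with the open-window intervals to produce an ML training label can be sketched in pandas. This is a minimal sketch, not code from the articles; the function name and the PST-to-UTC handling are assumptions based on the column names:

    ```python
    import pandas as pd

    def label_window_open(temps: pd.DataFrame, windows: pd.DataFrame) -> pd.DataFrame:
        """Add a boolean 'window_open' column to the raw sensor readings.

        temps:   columns timestamp_utc, timestamp_epoch, temp_f, temp_c, device_id
        windows: columns DayPST, StartTimePST, EndTimePST, ObjectCode, ObjectName
        """
        # Build absolute start/end timestamps from the PST day + time columns,
        # then convert to UTC to match the sensor timestamps.
        starts = pd.to_datetime(windows["DayPST"] + " " + windows["StartTimePST"])
        ends = pd.to_datetime(windows["DayPST"] + " " + windows["EndTimePST"])
        starts = starts.dt.tz_localize("US/Pacific").dt.tz_convert("UTC")
        ends = ends.dt.tz_localize("US/Pacific").dt.tz_convert("UTC")

        ts = pd.to_datetime(temps["timestamp_utc"], utc=True)
        # A reading inside at least one interval is labeled open; all other
        # time points are assumed to have every window closed, as stated above.
        open_mask = pd.Series(False, index=temps.index)
        for start, end in zip(starts, ends):
            open_mask |= ts.between(start, end)

        out = temps.copy()
        out["window_open"] = open_mask
        return out
    ```

    The resulting window_open column is the kind of binary target the BigQuery ML / AutoML Tables model in Part 3 would predict from the temperature features.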

    All GitHub gists showcasing how to manipulate and make use of these files are present in the Medium.com articles:

    Part 1 (IoT device setup and streaming into IoT Core / PubSub): https://blog.doit-intl.com/production-scale-iot-best-practices-implementation-with-gcp-part-1-3-44e2fa0e6554

    Part 2 (Shuttling IoT data from PubSub into BigQuery via Dataflow): https://blog.doit-intl.com/production-scale-iot-best-practices-implementation-with-gcp-part-2-3-4a9e59d51214

    Part 3 (BigQuery ML model training, deployment, and near real-time predictions): https://blog.doit-intl.com/production-scale-iot-best-practices-implementation-with-google-cloud-part-3-3-7f2fa99f6785

    Acknowledgements

    This dataset and the corresponding articles were produced by Matthew Porter, a Senior Cloud Architect at DoiT International.

    Inspiration

    My beautiful nine week old Corgi puppy Maple, whose as-of-yet lack of potty training has led to many air-refreshing opened windows in the heart of winter!

  3. Time Series of Aerial Imagery from Small Unmanned Aircraft Systems and Associated Ground Control Points: Madeira Beach, Florida, July 2017 to June 2018 (Aerial Imagery)

    • datasets.ai
    • data.usgs.gov
    • +1more
    55
    Updated Jun 1, 2023
    + more versions
    Cite
    Department of the Interior (2023). Time Series of Aerial Imagery from Small Unmanned Aircraft Systems and Associated Ground Control Points: Madeira Beach, Florida, July 2017 to June 2018 (Aerial Imagery) [Dataset]. https://datasets.ai/datasets/time-series-of-aerial-imagery-from-small-unmanned-aircraft-systems-and-associated-ground-c
    Explore at:
    55
    Dataset updated
    Jun 1, 2023
    Dataset authored and provided by
    Department of the Interior
    Area covered
    Florida, Madeira Beach
    Description

    Aerial imagery acquired with a small unmanned aircraft system (sUAS), in conjunction with surveyed ground control points (GCPs) visible in the imagery, can be processed with structure-from-motion (SfM) photogrammetry techniques to produce high-resolution orthomosaics, three-dimensional (3D) point clouds and digital elevation models (DEMs). This dataset, prepared by the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center (SPCMSC), provides UAS survey data consisting of aerial imagery and GCP positions and elevations collected at Madeira Beach, Florida, monthly from July 2017 to June 2018 in order to observe seasonal and storm-induced changes in beach topography.

  4. National Water Model HydroLearn Python Notebooks

    • hydroshare.org
    zip
    Updated Nov 14, 2023
    + more versions
    Cite
    Dan Ames; Justin Hunter (2023). National Water Model HydroLearn Python Notebooks [Dataset]. http://doi.org/10.4211/hs.5949aec47b484e689573beeb004a2917
    Explore at:
    zip (1.8 MB)
    Dataset updated
    Nov 14, 2023
    Dataset provided by
    HydroShare
    Authors
    Dan Ames; Justin Hunter
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This resource contains Jupyter Python notebooks intended for learning about the U.S. National Water Model (NWM). These notebooks explore NWM forecasts in various ways. Notebooks 1, 2, and 3 access NWM forecasts directly from the NOAA NOMADS file sharing system, while Notebook 4 accesses NWM forecasts from Google Cloud Platform (GCP) storage in addition to NOMADS. A brief summary of what each notebook does is included below:

    Notebook 1 (NWM1_Visualization) focuses on visualization. It includes functions for downloading and extracting time series forecasts for any of the 2.7 million stream reaches of the U.S. NWM. It also demonstrates ways to visualize forecasts using Python packages like matplotlib.

    Notebook 2 (NWM2_Xarray) explores methods for slicing and dicing NWM NetCDF files using the Python library xarray.

    Notebook 3 (NWM3_Subsetting) is focused on subsetting NWM forecasts and NetCDF files for specified reaches and exporting NWM forecast data to CSV files.

    Notebook 4 (NWM4_Hydrotools) uses Hydrotools, a new suite of tools for evaluating NWM data, to retrieve NWM forecasts both from NOMADS and from Google Cloud Platform storage where older NWM forecasts are cached. This notebook also briefly covers visualizing, subsetting, and exporting forecasts retrieved with Hydrotools.

    NOTE: Notebook 4 requires a newer version of NumPy than is available on the default CUAHSI JupyterHub instance. Please use the instance "HydroLearn - Intelligent Earth" and be sure to run !pip install hydrotools.nwm_client[gcp].
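    With that package installed, retrieving cached NWM forecasts from GCP storage (as Notebook 4 does with Hydrotools) can be sketched roughly as below. This is an illustrative sketch, not code from the notebooks; the configuration and reference time shown are example values, and the client API may differ between hydrotools versions:

    ```python
    # Sketch of fetching a cached NWM forecast from GCP storage with hydrotools.
    # Requires: pip install hydrotools.nwm_client[gcp] and network access.
    from hydrotools.nwm_client import gcp as nwm

    def fetch_short_range(reference_time: str):
        """Return a pandas DataFrame of short-range channel forecasts
        for the given model cycle (e.g. "20210101T01Z")."""
        service = nwm.NWMDataService()
        return service.get(
            configuration="short_range",
            reference_time=reference_time,
        )

    if __name__ == "__main__":
        forecast = fetch_short_range("20210101T01Z")
        # The resulting frame can then be subset by feature ID and exported
        # to CSV, much as Notebook 3 does for NOMADS data.
        print(forecast.head())
    ```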

    The notebooks are part of a NWM learning module on HydroLearn.org. When the associated learning module is complete, the link to it will be added here. It is recommended that these notebooks be opened through the CUAHSI JupyterHub App on Hydroshare. This can be done via the 'Open With' button at the top of this resource page.

