License: Community Data License Agreement - Sharing, Version 1.0 (https://cdla.io/sharing-1-0/)
This dataset highlights the importance of effectively using Google Cloud Platform (GCP) billing data to gain actionable insights into cloud spending. It emphasizes the need for strategic cost management, offering guidance on how to analyze billing data, optimize resource usage, and implement best practices that minimize costs while maximizing the value derived from cloud services. It is geared towards businesses and technical teams looking to maintain financial control and improve their cloud operations.
This dataset contains GCP cloud billing cost data. For an updated version, please comment or contact the author.
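As a hedged illustration of the kind of analysis such billing data supports, the sketch below sums spend per service from a BigQuery billing export using the google-cloud-bigquery Python client. The project, dataset, and table names are placeholders (a real export table is named gcp_billing_export_v1_&lt;BILLING_ACCOUNT_ID&gt;), so adapt them to your own export.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Placeholder table name: substitute your own billing-export dataset/table.
query = """
SELECT
  service.description AS service,
  SUM(cost) AS total_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX`
GROUP BY service
ORDER BY total_cost DESC
"""

# Print per-service spend, largest first.
for row in client.query(query).result():
    print(f"{row.service}: {row.total_cost:.2f}")
```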
License: Open Database License (ODbL) v1.0 (https://www.opendatacommons.org/licenses/odbl/1.0/)
License information was derived automatically
This dataset corresponds to a three-part Medium.com article series covering a fully reproducible walkthrough of Production-Scale Best Practices for IoT data streaming on GCP. Covered topics include: IoT device setup and streaming into GCP's IoT Core and PubSub, movement of data from PubSub to BigQuery via Dataflow, visualization of that data in Data Studio, and an effective machine learning model construction and deployment process via BigQuery ML (using AutoML Tables).
The dataset can ultimately be used to generate a machine learning model that identifies in near real-time whether or not a particular window in my home is open, based on temperature values from three different sensors. The entire workflow, from data ingestion to continually deployed ML predictions, is achieved with a fully managed, auto-scaling, and serverless architecture.
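To make the ingestion step concrete, here is a minimal sketch of publishing one sensor reading to PubSub with the google-cloud-pubsub Python client; the project ID, topic name, and payload fields are illustrative assumptions, not the exact names used in the articles.

```python
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# Placeholder project and topic; the articles define their own.
topic_path = publisher.topic_path("my-gcp-project", "temperature-readings")

# One reading from one of the three temperature sensors.
payload = {
    "device_id": "sensor-1",
    "temperature_c": 19.4,
    "timestamp": "2020-01-15T08:00:00Z",
}

# PubSub messages carry raw bytes, so JSON-encode the reading first.
future = publisher.publish(topic_path, data=json.dumps(payload).encode("utf-8"))
print(f"Published message {future.result()}")
```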
Present in this dataset are two files:
All GitHub gists showcasing how to manipulate and make use of these files are present in the Medium.com articles:
Part 1 (IoT device setup and streaming into IoT Core / PubSub): https://blog.doit-intl.com/production-scale-iot-best-practices-implementation-with-gcp-part-1-3-44e2fa0e6554
Part 2 (Shuttling IoT data from PubSub into BigQuery via Dataflow): https://blog.doit-intl.com/production-scale-iot-best-practices-implementation-with-gcp-part-2-3-4a9e59d51214
Part 3 (BigQuery ML model training, deployment, and near real-time predictions): https://blog.doit-intl.com/production-scale-iot-best-practices-implementation-with-google-cloud-part-3-3-7f2fa99f6785
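For a flavor of the Part 3 step, the sketch below launches BigQuery ML training of an AutoML Tables classifier from Python. The dataset, table, and column names are assumptions for illustration; the articles' gists contain the actual statements.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project ID

# Train an AutoML Tables classifier through BigQuery ML.
# Dataset, table, and column names are illustrative only.
sql = """
CREATE OR REPLACE MODEL `iot_demo.window_state_model`
OPTIONS (
  model_type = 'AUTOML_CLASSIFIER',
  input_label_cols = ['window_open'],
  budget_hours = 1.0
) AS
SELECT sensor_1_temp, sensor_2_temp, sensor_3_temp, window_open
FROM `iot_demo.sensor_readings`
"""

client.query(sql).result()  # blocks until the training job finishes
```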
This dataset and the corresponding articles were produced by Matthew Porter, a Senior Cloud Architect at DoiT International.
My beautiful nine-week-old Corgi puppy Maple, whose as-yet incomplete potty training has led to many air-refreshing opened windows in the heart of winter!
Aerial imagery acquired with a small unmanned aircraft system (sUAS), in conjunction with surveyed ground control points (GCPs) visible in the imagery, can be processed with structure-from-motion (SfM) photogrammetry techniques to produce high-resolution orthomosaics, three-dimensional (3D) point clouds, and digital elevation models (DEMs). This dataset, prepared by the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center (SPCMSC), provides UAS survey data consisting of aerial imagery and GCP positions and elevations collected at Madeira Beach, Florida, monthly from July 2017 to June 2018, in order to observe seasonal and storm-induced changes in beach topography.
License: Attribution 4.0 International (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
License information was derived automatically
This resource contains Jupyter Python notebooks intended for learning about the U.S. National Water Model (NWM). These notebooks explore NWM forecasts in various ways. Notebooks 1, 2, and 3 access NWM forecasts directly from the NOAA NOMADS file-sharing system, while Notebook 4 accesses NWM forecasts from Google Cloud Platform (GCP) storage in addition to NOMADS. A brief summary of what each notebook does is included below:
Notebook 1 (NWM1_Visualization) focuses on visualization. It includes functions for downloading and extracting time series forecasts for any of the 2.7 million stream reaches of the U.S. NWM. It also demonstrates ways to visualize forecasts using Python packages like matplotlib.
Notebook 2 (NWM2_Xarray) explores methods for slicing and dicing NWM NetCDF files using the Python library xarray (a minimal slicing sketch appears after this list).
Notebook 3 (NWM3_Subsetting) is focused on subsetting NWM forecasts and NetCDF files for specified reaches and exporting NWM forecast data to CSV files.
Notebook 4 (NWM4_Hydrotools) uses Hydrotools, a new suite of tools for evaluating NWM data, to retrieve NWM forecasts both from NOMADS and from Google Cloud Platform storage where older NWM forecasts are cached. This notebook also briefly covers visualizing, subsetting, and exporting forecasts retrieved with Hydrotools.
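As a rough flavor of the slicing Notebook 2 demonstrates, this minimal sketch opens a single NWM channel-route output file with xarray and extracts streamflow for one reach; the filename and feature_id below are placeholders, and the notebook covers the real retrieval and naming conventions.

```python
import xarray as xr

# Placeholder filename; real files follow the pattern
# nwm.tHHz.<configuration>.channel_rt.fNNN.conus.nc
ds = xr.open_dataset("nwm.t00z.short_range.channel_rt.f001.conus.nc")

# NWM channel files index reaches by feature_id; 101 is a placeholder reach.
reach = ds.sel(feature_id=101)
print(reach["streamflow"].values)  # streamflow in cubic meters per second
```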
NOTE: Notebook 4 requires a newer version of NumPy than is available on the default CUAHSI JupyterHub instance. Please use the instance "HydroLearn - Intelligent Earth" and be sure to run !pip install hydrotools.nwm_client[gcp].
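Once installed, retrieving a cached forecast with Hydrotools looks roughly like the sketch below, which follows the package's documented usage; the reference time is a placeholder for any archived forecast cycle.

```python
from hydrotools.nwm_client import gcp as nwm

# Service that fetches archived NWM forecasts from Google Cloud storage.
model_data_service = nwm.NWMDataService()

# Placeholder reference time; any cached forecast cycle works.
forecast_data = model_data_service.get(
    configuration="short_range",
    reference_time="20210514T01Z",
)
print(forecast_data.head())  # pandas DataFrame of forecast values
```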
The notebooks are part of an NWM learning module on HydroLearn.org; a link will be added here when the module is complete. It is recommended that these notebooks be opened through the CUAHSI JupyterHub app on HydroShare, via the 'Open With' button at the top of this resource page.