The total amount of data created, captured, copied, and consumed globally is forecast to increase rapidly, reaching 149 zettabytes in 2024. Over the next five years, up to 2028, global data creation is projected to grow to more than 394 zettabytes. In 2020, the amount of data created and replicated reached a new high. Growth was higher than previously expected, driven by increased demand during the COVID-19 pandemic as more people worked and learned from home and used home entertainment options more often.
Storage capacity also growing
Only a small percentage of this newly created data is kept, though: just two percent of the data produced and consumed in 2020 was saved and retained into 2021. In line with the strong growth of the data volume, the installed base of storage capacity is forecast to increase, growing at a compound annual growth rate of 19.2 percent over the forecast period from 2020 to 2025. In 2020, the installed base of storage capacity reached 6.7 zettabytes.
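As a rough illustration of the arithmetic behind that storage projection, the sketch below compounds the stated 2020 installed base of 6.7 zettabytes at the stated 19.2 percent annual rate; the function name and the printout are purely illustrative and not part of the source data.

```python
# Minimal sketch: projecting installed storage capacity with a compound
# annual growth rate (CAGR). The base (6.7 ZB in 2020) and the 19.2% rate
# come from the text above; everything else is illustrative.

def project_capacity(base_zb: float, cagr: float, years: int) -> float:
    """Capacity after `years` of compound growth at annual rate `cagr`."""
    return base_zb * (1 + cagr) ** years

if __name__ == "__main__":
    base_2020 = 6.7   # zettabytes, installed base in 2020
    cagr = 0.192      # 19.2 percent per year
    for year in range(2020, 2026):
        zb = project_capacity(base_2020, cagr, year - 2020)
        print(f"{year}: {zb:.1f} ZB")
```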
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The complete COVID-19 dataset is a collection of the COVID-19 data maintained by Our World in Data that is updated throughout the duration of COVID-19. It includes information related to confirmed cases and deaths, hospitalization, intensive care unit admissions, testing for COVID-19, and vaccination for COVID-19.
Confirmed cases and deaths: this data is collected from the World Health Organization Coronavirus Dashboard. The cases & deaths dataset is updated daily. Note 1: Time/date stamps reflect when the data was last updated by WHO. Due to the time required to process and validate the incoming data, there is a delay between reporting to WHO and the update of the dashboard. Note 2: Counts and corrections made after these times will be carried forward to the next reporting cycle for that specific region. Delayed reporting for any specific country, territory or area may result in pooled counts for multiple days being presented, with a retrospective update to counts on previous days to accurately reflect trends. Significant data errors detected or reported to WHO may be corrected at more frequent intervals.
Hospitalizations and intensive care unit (ICU) admissions: our data is collected from official sources and collated by Our World in Data. The complete list of country-by-country sources is available here.
Testing for COVID-19: this data is collected by the Our World in Data team from official reports; you can find further details in our post on COVID-19 testing, including our checklist of questions to understand testing data, information on geographical and temporal coverage, and detailed country-by-country source information. On 23 June 2022, we stopped adding new datapoints to our COVID-19 testing dataset. You can read more here.
Vaccinations against COVID-19: this data is collected by the Our World in Data team from official reports.
Other variables: this data is collected from a variety of sources (United Nations, World Bank, Global Burden of Disease, Blavatnik School of Government, etc.). More information is available in our codebook.
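A minimal pandas sketch of how the complete dataset can be explored is shown below. The file name owid-covid-data.csv and the column names used (location, date, new_cases, people_fully_vaccinated) are assumptions based on the Our World in Data codebook and should be checked against the actual download.

```python
# Minimal sketch: exploring the Our World in Data complete COVID-19 dataset.
# Assumes the dataset has been downloaded locally as "owid-covid-data.csv"
# and contains columns named location, date, new_cases and
# people_fully_vaccinated (see the OWID codebook for the authoritative list).
import pandas as pd

df = pd.read_csv("owid-covid-data.csv", parse_dates=["date"])

# Daily new confirmed cases for one country
france = df[df["location"] == "France"].set_index("date")
print(france["new_cases"].tail())

# Latest reported fully-vaccinated count per location (last non-missing value)
latest_vax = (
    df.sort_values("date")
      .groupby("location")["people_fully_vaccinated"]
      .last()
)
print(latest_vax.dropna().sort_values(ascending=False).head(10))
```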
As of March 2025, there were a reported 5,426 data centers in the United States, the most of any country worldwide. A further 529 were located in Germany, while 523 were located in the United Kingdom.
What is a data center?
A data center is a network of computing and storage resources that enables the delivery of shared software applications and data. These facilities can house large amounts of critical and important data, and therefore are vital to the daily functions of companies and consumers alike. As a result, whether it is a cloud, colocation, or managed service, data center real estate will have increasing importance worldwide.
Hyperscale data centers
In the past, data centers were highly controlled physical infrastructures, but the cloud has since changed that model. A cloud data service is a remote version of a data center – located somewhere away from a company's physical premises. Cloud IT infrastructure spending has grown and is forecast to rise further in the coming years. The evolution of technology, along with the rapid growth in demand for data across the globe, is largely driven by the leading hyperscale data center providers.
CC0 1.0 Universal Public Domain Dedication https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
This article introduces the most comprehensive dataset on de jure central bank independence (CBI), including yearly data from 182 countries between 1970 and 2012. The dataset identifies statutory reforms affecting CBI, their direction, and the attributes necessary to build the Cukierman, Webb and Neyapti index. Previous datasets focused on developed countries, and included non-representative samples of developing countries. This dataset’s substantially broader coverage has important implications. First, it challenges the conventional wisdom about central bank reforms in the world, revealing CBI increases and restrictions in decades and regions previously considered barely affected by reforms. Second, the inclusion of almost 100 countries usually overlooked in previous studies suggests that sample selection may have substantially affected results. Simple analyses show that the associations between CBI and inflation, unemployment or growth are very sensitive to sample selection. Finally, the dataset identifies numerous CBI decreases (restrictions), whereas previous datasets mostly look at CBI increases. These data’s coverage not only allows researchers to test competing explanations of the determinants and effects of CBI in a global sample, but it also provides a useful instrument for cross-national studies in diverse fields, such as liberalization, diffusion, political institutions, democratization, or responses to financial crises.
Every day, the Johns Hopkins CSSE publishes worldwide data on confirmed, recovered, and deceased COVID-19 cases in its GitHub repository. Gisaïa reuses these data for display in ARLAS Exploration. Terms of Use from the Johns Hopkins CSSE: This GitHub repo and its contents herein, including all data, mapping, and analysis, copyright 2020 Johns Hopkins University, all rights reserved, is provided to the public strictly for educational and academic research purposes. The Website relies upon publicly available data from multiple sources, that do not always agree. The Johns Hopkins University hereby disclaims any and all representations and warranties with respect to the Website, including accuracy, fitness for use, and merchantability. Reliance on the Website for medical guidance or use of the Website in commerce is strictly prohibited.
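A minimal sketch of reshaping the repository's one-column-per-date time series into a long format is shown below. The file name and column names are assumptions based on the historical public layout of the CSSE repository and may have changed, so treat them as illustrative.

```python
# Minimal sketch: reading the JHU CSSE global confirmed-cases time series
# and reshaping it from wide (one column per date) to long format.
# Assumes the CSV was fetched from the CSSEGISandData/COVID-19 repository
# and keeps the historical layout: "Province/State", "Country/Region",
# "Lat", "Long" id columns followed by one column per reporting date.
import pandas as pd

confirmed = pd.read_csv("time_series_covid19_confirmed_global.csv")

long = confirmed.melt(
    id_vars=["Province/State", "Country/Region", "Lat", "Long"],
    var_name="date",
    value_name="cumulative_cases",
)
long["date"] = pd.to_datetime(long["date"])

# Cumulative confirmed cases per country on the latest available date
latest = long[long["date"] == long["date"].max()]
print(latest.groupby("Country/Region")["cumulative_cases"].sum().nlargest(10))
```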
This dataset contains data presented on the World Emissions Clock hosted by the World Data Lab.
The World Emissions Clock provides trajectories of future greenhouse gas emissions until 2050 for 180 countries, five main sectors and up to 24 subsectors, and three different scenarios. These hypothetical scenarios are:
Business as usual (BAU), where technological advancement and policy-making roughly follows past trends without major shifts.
Nationally determined contributions (NDC), where countries fully implement their unconditional climate pledges as submitted to the United Nations Framework Convention on Climate Change (UNFCCC).
Achieving 1.5°C, where sectoral emissions within countries follow a cost-efficient pathway towards limiting global warming to 1.5° Celsius by 2100.
For further information, see the Methodology section of the World Emissions Clock. Contact wec@worlddata.io for access information.
The World Emissions Clock was created in cooperation between the World Data Lab and the International Institute for Applied Systems Analysis (IIASA), the Vienna University of Economics and Business (WU Vienna), and the University of Oxford, and was supported by the German Federal Ministry for Economic Cooperation and Development (BMZ), the German Agency for International Cooperation (GIZ), and the Patrick J. McGovern Foundation.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Collection of resources used by crisismappers for finding data
This dataset shows the Battery Electric Vehicles (BEVs) and Plug-in Hybrid Electric Vehicles (PHEVs) that are currently registered through the Washington State Department of Licensing (DOL).
This dataset is the basis for the International Food Security Assessment, 2016-2026 released in June 2016. This annual ERS report projects food availability and access for 76 low- and middle-income countries over a 10-year period. The dataset includes annual country-level data on area, yield, production, nonfood use, trade, and consumption for grains and root and tuber crops (combined as R&T in the documentation tables), food aid, total value of imports and exports, gross domestic product, and population compiled from a variety of sources.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Contains data from the World Bank's data portal covering the following topics which also exist as individual datasets on HDX: Agriculture and Rural Development, Aid Effectiveness, Economy and Growth, Education, Energy and Mining, Environment, Financial Sector, Health, Infrastructure, Social Protection and Labor, Private Sector, Public Sector, Science and Technology, Social Development, Urban Development, Gender, Climate Change, External Debt, Trade.
The global big data market is forecast to grow to 103 billion U.S. dollars by 2027, more than double its expected market size in 2018. With a share of 45 percent, the software segment would become the largest big data market segment by 2027.
What is Big data?
Big data is a term that refers to the kind of data sets that are too large or too complex for traditional data processing applications. It is defined as having one or some of the following characteristics: high volume, high velocity or high variety. Fast-growing mobile data traffic, cloud computing traffic, as well as the rapid development of technologies such as artificial intelligence (AI) and the Internet of Things (IoT) all contribute to the increasing volume and complexity of data sets.
Big data analytics
Advanced analytics tools, such as predictive analytics and data mining, help to extract value from the data and generate new business insights. The global big data and business analytics market was valued at 169 billion U.S. dollars in 2018 and is expected to grow to 274 billion U.S. dollars in 2022. As of November 2018, 45 percent of professionals in the market research industry reportedly used big data analytics as a research method.
During the third quarter of 2024, data breaches exposed more than 422 million records worldwide. Since the first quarter of 2020, the highest number of data records were exposed in the first quarter of 202, more than 818 million data sets. Data breaches remain among the biggest concerns of company leaders worldwide. The most common causes of sensitive information loss were operating system vulnerabilities on endpoint devices.
Which industries see the most data breaches?
Certain conditions make some industry sectors more prone to data breaches than others. According to the latest observations, public administration experienced the highest number of data breaches between 2021 and 2022, with 495 reported data breach incidents with confirmed data loss. Financial institutions came second, with 421 data breach cases, followed by healthcare providers.
Data breach cost
Data breach incidents have various consequences, the most common being financial losses and business disruptions. As of 2023, the average data breach cost across businesses worldwide was 4.45 million U.S. dollars, while a leaked data record cost about 165 U.S. dollars. The United States saw the highest average breach cost globally, at 9.48 million U.S. dollars.
Note: In these datasets, a person is defined as up to date if they have received at least one dose of an updated COVID-19 vaccine. The Centers for Disease Control and Prevention (CDC) recommends that certain groups, including adults ages 65 years and older, receive additional doses.
On 6/16/2023, CDPH replaced the booster measures with a new “Up to Date” measure based on CDC’s new recommendations, replacing the primary series, boosted, and bivalent booster metrics. The definition of “primary series complete” has not changed and is based on previous recommendations that CDC has since simplified. A person cannot complete their primary series with a single dose of an updated vaccine. Whereas the booster measures were calculated using the eligible population as the denominator, the new up to date measure uses the total estimated population. Please note that the rates for some groups may change since the up to date measure is calculated differently than the previous booster and bivalent measures.
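To make the denominator change concrete, here is a small illustrative calculation; all numbers are made up for demonstration and do not come from the dataset. The same count of up-to-date people yields a different rate when divided by an eligible population than by the total estimated population.

```python
# Illustrative only: how switching the denominator from the eligible
# population to the total estimated population changes a coverage rate.
# All figures below are invented for demonstration.
up_to_date_people = 60_000
eligible_population = 90_000   # e.g. a former booster-eligible denominator
total_population = 120_000     # total estimated population denominator

rate_eligible = up_to_date_people / eligible_population   # ~66.7%
rate_total = up_to_date_people / total_population         # 50.0%

print(f"Rate vs eligible population: {rate_eligible:.1%}")
print(f"Rate vs total population:    {rate_total:.1%}")
```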
This data is from the same source as the Vaccine Progress Dashboard at https://covid19.ca.gov/vaccination-progress-data/ which summarizes vaccination data at the county level by county of residence. Where county of residence was not reported in a vaccination record, the county of provider that vaccinated the resident is included. This applies to less than 1% of vaccination records. The sum of county-level vaccinations does not equal statewide total vaccinations due to out-of-state residents vaccinated in California.
These data do not include doses administered by the following federal agencies that received vaccine allocated directly from CDC: Indian Health Service, Veterans Health Administration, Department of Defense, and the Federal Bureau of Prisons.
Totals for the Vaccine Progress Dashboard and this dataset may not match, as the Dashboard totals doses by Report Date and this dataset totals doses by Administration Date. Dose numbers may also change for a particular Administration Date as data is updated.
Previous updates:
On March 3, 2023, with the release of HPI 3.0 in 2022, the previous equity scores have been updated to reflect more recent community survey information. This change represents an improvement to the way CDPH monitors health equity by using the latest and most accurate community data available. The HPI uses a collection of data sources and indicators to calculate a measure of community conditions ranging from the most to the least healthy based on economic, housing, and environmental measures.
Starting on July 13, 2022, the denominator for calculating vaccine coverage has been changed from age 5+ to all ages to reflect new vaccine eligibility criteria. Previously the denominator was changed from age 16+ to age 12+ on May 18, 2021, then changed from age 12+ to age 5+ on November 10, 2021, to reflect previous changes in vaccine eligibility criteria. The previous datasets based on age 16+ and age 5+ denominators have been uploaded as archived tables.
Starting on May 29, 2021 the methodology for calculating on-hand inventory in the shipped/delivered/on-hand dataset has changed. Please see the accompanying data dictionary for details. In addition, this dataset is now down to the ZIP code level.
https://datacatalog.worldbank.org/public-licenses?fragment=cc
Developed by SOLARGIS and provided by the Global Solar Atlas (GSA), this data resource contains terrain elevation above sea level (ELE) in [m a.s.l.] covering the globe. Data is provided in a geographic spatial reference (EPSG:4326). The resolution (pixel size) of solar resource data (GHI, DIF, GTI, DNI) is 9 arcsec (nominally 250 m), PVOUT and TEMP 30 arcsec (nominally 1 km) and OPTA 2 arcmin (nominally 4 km).
The data is hyperlinked under 'resources' with the following characteristics:
ELE - GISdata (GeoTIFF)
Data format: GEOTIFF
File size: 826.8 MB
There are two temporal representations of solar resource and PVOUT data available:
• Longterm yearly/monthly average of daily totals (LTAym_AvgDailyTotals)
• Longterm average of yearly/monthly totals (LTAym_YearlyMonthlyTotals)
Both types of data are equivalent; you can select the summarization of your preference. The relation between the datasets is described by simple equations (a short conversion sketch follows the equations below):
• LTAy_YearlyTotals = LTAy_DailyTotals * 365.25
• LTAy_MonthlyTotals = LTAy_DailyTotals * Number_of_Days_In_The_Month
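The sketch below simply encodes the two conversions stated above; the example value of 4.8 is illustrative and does not come from the dataset.

```python
# Minimal sketch of the conversions above: long-term average daily totals
# to yearly and monthly totals. The 4.8 input value is illustrative only.
DAYS_PER_YEAR = 365.25

def yearly_total(daily_total: float) -> float:
    return daily_total * DAYS_PER_YEAR

def monthly_total(daily_total: float, days_in_month: int) -> float:
    return daily_total * days_in_month

print(yearly_total(4.8))        # 1753.2 per year
print(monthly_total(4.8, 31))   # 148.8 for a 31-day month
```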
*For individual country or regional data downloads please see: https://globalsolaratlas.info/download (use the drop-down menu to select country or region of interest)
*For data provided in AAIGrid please see: https://globalsolaratlas.info/download/world.
For more information and terms of use, please read the metadata provided in PDF and XML format for each data layer in a download file. For other data formats, resolution or time aggregation, please visit the Solargis website. Data can be used for visualization, further processing, and geo-analysis in all mainstream GIS software with raster data processing capabilities (such as open-source QGIS, commercial ESRI ArcGIS products, and others).
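A minimal sketch of inspecting the ELE GeoTIFF outside of a GIS application is shown below, using the open-source rasterio package; the local file name and the sample coordinates are illustrative assumptions, not part of the Global Solar Atlas documentation.

```python
# Minimal sketch: inspecting the Global Solar Atlas terrain-elevation (ELE)
# GeoTIFF with rasterio. The local file name "ELE.tif" is illustrative;
# download the layer from the Global Solar Atlas first.
import rasterio

with rasterio.open("ELE.tif") as src:
    print(src.crs)             # expected EPSG:4326, as stated above
    print(src.res)             # pixel size in degrees
    elevation = src.read(1)    # single band: elevation in m a.s.l.
    print(elevation.shape, elevation.min(), elevation.max())

    # Sample elevation at one longitude/latitude point (illustrative coords)
    lon, lat = 16.37, 48.21
    value = next(src.sample([(lon, lat)]))[0]
    print(f"Elevation at ({lon}, {lat}): {value} m a.s.l.")
```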
https://www.meticulousresearch.com/privacy-policy
Real-world Data (RWD) Market by Source (EMR, Claims, Pharmacy, Disease Registries), Application [Market Access, Drug Development & Approvals (Oncology, Neurology), Post Market Surveillance], End User (Pharma, Payers, Providers) - Global Forecast to 2029
U.S. and foreign publications held at the World Data Center-A for Oceanography, NODC, Silver Spring, MD.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The article discusses the global market for data processing servers, predicting continued growth in demand over the next decade. The market is expected to expand with an anticipated CAGR of +1.3% in volume and +2.5% in value between 2024 and 2035, reaching 112M units and $134.3B respectively by the end of 2035.
This archived Paleoclimatology Study is available from the NOAA National Centers for Environmental Information (NCEI), under the World Data Service (WDS) for Paleoclimatology. The associated NCEI study type is Paleoceanography. The data include parameters of paleoceanography with a geographic location of North Atlantic Ocean. The time period coverage is from 512210 to 958 in calendar years before present (BP). See metadata information for parameter and study location details. Please cite this study when using the data.
Compilation of historical Earth surface temperatures. Source: https://www.kaggle.com/berkeleyearth/climate-change-earth-surface-temperature-data
Data compiled by the Berkeley Earth project, which is affiliated with Lawrence Berkeley National Laboratory. The Berkeley Earth Surface Temperature Study combines 1.6 billion temperature reports from 16 pre-existing archives. It is nicely packaged and allows for slicing into interesting subsets (for example by country). They publish the source data and the code for the transformations they applied. They also use methods that allow weather observations from shorter time series to be included, meaning fewer observations need to be thrown away.
In this dataset, we have included several files:
Global Land and Ocean-and-Land Temperatures (GlobalTemperatures.csv):
**Other files include** further breakdowns of the same temperature data, for example by country.
The raw data comes from the Berkeley Earth data page.
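A minimal sketch of loading GlobalTemperatures.csv with pandas and computing annual means is shown below; the column names dt and LandAverageTemperature are assumptions based on the Kaggle copy of the Berkeley Earth data and should be verified against the downloaded file.

```python
# Minimal sketch: loading GlobalTemperatures.csv and computing the annual
# mean land temperature. Assumes a "dt" date column and a
# "LandAverageTemperature" column, as in the Kaggle copy of the data.
import pandas as pd

temps = pd.read_csv("GlobalTemperatures.csv", parse_dates=["dt"])

annual = (
    temps.set_index("dt")["LandAverageTemperature"]
         .resample("YS")   # calendar-year bins
         .mean()
)
print(annual.tail(10))
```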
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Learn about the increasing demand for data processing servers worldwide and how the market is expected to grow over the next decade in terms of volume and value.