This release is for quarters 1 to 4 of 2019 to 2020.
Local authority commissioners and health professionals can use these resources to track how many pregnant women, children and families in their local area have received health promoting reviews at particular points during pregnancy and childhood.
The data and commentaries also show variation at a local, regional and national level. This can help with planning, commissioning and improving local services.
The metrics cover health reviews for pregnant women, children and their families at several stages, which are:
Public Health England (PHE) collects the data, which is submitted by local authorities on a voluntary basis.
See health visitor service delivery metrics in the child and maternal health statistics collection to access data for previous years.
Find guidance on using these statistics and other intelligence resources to help you make decisions about the planning and provision of child and maternal health services.
See health visitor service metrics and outcomes definitions from Community Services Dataset (CSDS).
Since publication in November 2020, Lewisham and Leicestershire councils have identified errors in the new birth visits within 14 days data they submitted to Public Health England (PHE) for 2019 to 2020. These errors caused a statistically significant change in the health visiting data for 2019 to 2020, so the Office for Health Improvement and Disparities (OHID) has updated and reissued the data in OHID’s Fingertips tool.
A correction notice has been added to the 2019 to 2020 annual statistical release and statistical commentary but the data has not been altered.
Please consult OHID’s Fingertips tool for corrected data for Lewisham and Leicestershire, the London and East Midlands region, and England.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
Errata: Due to a coding error, monthly files with "dma8epax" statistics were wrongly aggregated. This concerns all gridded files of this metric as well as the monthly aggregated CSV files. All erroneous files were replaced with corrected versions on 16 January 2018. Each updated file contains a version label "1.1" and a brief description of the error. If you have made use of previous TOAR data files with the "dma8epax" metric, please replace them with the corrected versions.
Almost nine in ten data center and IT managers responding to a 2024 worldwide survey said that their organizations were collecting power consumption data, while around three quarters were compiling power usage effectiveness (PUE) figures. PUE is a common metric that expresses the ratio of total data center power consumption to the power used by critical IT infrastructure; the remainder is consumed by secondary systems such as cooling, lighting, and power distribution. While PUE can offer insights into a facility’s energy efficiency, data center operators are facing increasing pressure to report more comprehensive sustainability metrics, including renewable energy usage, carbon emissions, and water consumption.
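The PUE calculation itself is simple: total facility power divided by IT equipment power, with 1.0 as the (unreachable) ideal. A minimal sketch, with illustrative figures not drawn from the survey:

```python
# Minimal sketch of the PUE (Power Usage Effectiveness) calculation.
# Figures below are illustrative, not from the survey.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / IT equipment power (ideal value: 1.0)."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Example: 1,500 kW facility load, of which 1,000 kW powers IT equipment.
print(pue(1500, 1000))  # 1.5 -> 0.5 kW of overhead per kW of IT load
```

A PUE of 1.5 means half a kilowatt of cooling, lighting, and distribution overhead for every kilowatt delivered to IT equipment.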
api.data.gov is a free API management service for federal agencies. This API offers access to high-level metrics for the APIs that use the shared service. It is used to power the api.data.gov metrics page.
During an early 2024 global survey among marketing and media leaders, around ** percent reported monitoring social media engagement to track the performance of their content marketing efforts. The same share of respondents cited website engagement, while pageviews rounded up the top three, mentioned by ** percent of the interviewees. According to the same study, around ** percent of global marketers planned to raise their content budgets throughout 2024.
The report contains thirteen (13) performance metrics for the City's workforce development programs. Each metric can be broken down by three demographic types (gender, race/ethnicity, and age group) as well as by program target population (e.g., youth and young adults, NYCHA communities).
This report is a key output of an integrated data system, built by the Mayor's Office for Economic Opportunity (NYC Opportunity), that collects, integrates, and generates disaggregated data. Currently, the report is generated from an integrated database incorporating data from 18 workforce development programs managed by 5 City agencies.
There has been no single "workforce development system" in the City of New York. Instead, many discrete public agencies directly manage or fund local partners to deliver a range of different services, sometimes tailored to specific populations. As a result, program data have historically been fragmented as well, making it challenging to develop insights based on a comprehensive picture. To overcome this fragmentation, NYC Opportunity collects data from 5 City agencies into an integrated database, and has begun to build a complete picture of how participants move through the system onto a career pathway.
Each row represents a count of unique individuals for a specific performance metric, program target population, demographic group, and period. For example, if the Metric Value is 2000 with Clients Served (Metric Name), NYCHA Communities (Program Target Population), Asian (Subgroup), and 2019 (Period), you can say that "In 2019, 2,000 Asian individuals participated in programs targeting NYCHA communities."
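The row structure described above can be sketched in plain Python. Field names follow the description; the rows and values themselves are illustrative, not actual report data:

```python
# Hypothetical rows mirroring the fields described above; values are illustrative.
rows = [
    {"Period": "2019", "Metric Name": "Clients Served",
     "Program Target Population": "NYCHA Communities",
     "Subgroup": "Asian", "Metric Value": 2000},
    {"Period": "2019", "Metric Name": "Clients Served",
     "Program Target Population": "NYCHA Communities",
     "Subgroup": "Black/African American", "Metric Value": 5000},
]

def metric_value(rows, period, metric, population, subgroup):
    """Return the unique-individual count for one metric/population/subgroup/period."""
    for r in rows:
        if (r["Period"] == period and r["Metric Name"] == metric
                and r["Program Target Population"] == population
                and r["Subgroup"] == subgroup):
            return r["Metric Value"]
    return None

print(metric_value(rows, "2019", "Clients Served", "NYCHA Communities", "Asian"))  # 2000
```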
Please refer to the Workforce Data Portal for further data guidance (https://workforcedata.nyc.gov/en/data-guidance), and interactive visualizations for this report (https://workforcedata.nyc.gov/en/common-metrics).
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
Errata: On Dec 2nd, 2018, several yearly statistics files were replaced with new versions to correct an inconsistency related to the computation of the "dma8epax" statistics. As written in Schultz et al. (2017) [https://doi.org/10.1525/elementa.244], Supplement 1, Table 6: "When the aggregation period is “seasonal”, “summer”, or “annual”, the 4th highest daily 8-hour maximum of the aggregation period will be computed.". The data values for these aggregation periods are correct, however, the header information in the original files stated that the respective data column would contain "average daily maximum 8-hour ozone mixing ratio (nmol mol-1)". Therefore, the header of the seasonal, summer, and annual files has been corrected. Furthermore, the "dma8epax" column in the monthly files erroneously contained 4th highest daily maximum 8-hour average values, while it should have listed monthly average values instead. The data of this metric in the monthly files have therefore been replaced. The new column header reads "avgdma8epax". The updated files contain a version label "1.1" and a brief description of the error. If you have made use of previous TOAR data files with the "dma8epax" metric, please exchange your data files.
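The distinction at the heart of this erratum can be sketched as follows. This is a simplified illustration, not the TOAR processing code: MDA8 here is the daily maximum of 8-hour running means over a day's hourly values, and real TOAR processing also applies data-capture rules:

```python
# Simplified sketch: daily maximum 8-hour mean ozone (MDA8), then the two
# aggregations discussed above: the 4th-highest MDA8 ("dma8epax"-style) vs.
# the average MDA8 ("avgdma8epax"-style). Synthetic hourly data.

def mda8(hourly):
    """Daily max of 8-hour running means over one day's 24 hourly values."""
    windows = [sum(hourly[i:i + 8]) / 8 for i in range(len(hourly) - 7)]
    return max(windows)

# 30 synthetic days, each with a flat profile so MDA8 equals the day's level.
daily_mda8 = [mda8([float(level)] * 24) for level in range(30, 60)]

fourth_highest = sorted(daily_mda8, reverse=True)[3]   # seasonal/annual-style statistic
monthly_average = sum(daily_mda8) / len(daily_mda8)    # monthly-file statistic

print(fourth_highest, monthly_average)  # 56.0 44.5
```

The erratum above amounts to: the monthly files had been reporting the first quantity where the second was intended.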
The 311 website allows residents to submit service requests or check the status of existing requests online. The percentage of 311 website uptime (the share of time the site was available) and the target uptime for each week are shown by mousing over columns. The target availability for this site is 99.5%.
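The uptime figure is simply available time over total time. A minimal sketch; the 99.5% target is from the description above, while the minute counts are illustrative:

```python
# Weekly uptime percentage vs. the 99.5% availability target.
TARGET = 99.5  # percent, per the dataset description

def uptime_pct(available_minutes: float, total_minutes: float) -> float:
    return 100.0 * available_minutes / total_minutes

# Example week: 10,080 minutes total, 30 minutes of downtime.
week = uptime_pct(10_080 - 30, 10_080)
print(round(week, 2), week >= TARGET)  # 99.7 True
```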
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
Note: This dataset is on hiatus.
CDPH strives to respond equitably to the COVID-19 pandemic and is therefore interested in how different communities are impacted. Collecting and reporting health equity data helps to identify health disparities and improve the state’s response. To that end, CDPH tracks cases, deaths, and testing by race and ethnicity as well as other social determinants of health, such as income, crowded housing, and access to health insurance.
During the response, CDPH used a health equity metric, defined as the positivity rate in the most disproportionately impacted communities according to the Healthy Places Index. The purpose of this metric was to ensure California reopened its economy safely by reducing disease transmission in all communities. This metric is tracked and reported in comparison to the statewide positivity rate. More information is available at https://www.cdph.ca.gov/Programs/CID/DCDC/Pages/COVID-19/CaliforniaHealthEquityMetric.aspx.
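Test positivity is positive tests over total tests, and the equity comparison described above amounts to computing that rate for two populations. A minimal sketch with illustrative numbers, not CDPH data:

```python
# Positivity rate = positive tests / total tests, compared for the
# most-impacted communities (per the Healthy Places Index) vs. statewide.
# All numbers below are illustrative.

def positivity(positives: int, total_tests: int) -> float:
    return positives / total_tests

statewide = positivity(4_000, 100_000)        # 4.0% statewide
equity_quartile = positivity(1_500, 25_000)   # 6.0% in most-impacted communities

# The equity metric tracks how far the impacted-community rate sits above statewide.
print(round(statewide * 100, 1), round(equity_quartile * 100, 1))  # 4.0 6.0
```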
Data completeness is also critical to addressing inequities. CDPH reports data completeness by race and ethnicity, sexual orientation, and gender identity to better understand missingness in the data.
Health equity data is updated weekly. Data may be suppressed based on county population or total counts.
For more information on California’s commitment to health equity, please see https://covid19.ca.gov/equity/
This metric tracks the number of young people enrolled in youth services per month. DFSS is committed to creating a premier out-of-school time system that provides young people the opportunity to participate in high-quality, safe, and structured programs. DFSS funds over 200 Out-of-School Time (OST) programs that serve youth between the ages of 6 and 18 across the city of Chicago in five types of programs:
• Academic/Vocational Support and Enrichment - academic support, remedial education services, tutoring, literacy, and reconnecting youth with other educational opportunities
• Science, Computer, and Technology - skills building focused on computer programming, software, and technology
• Arts and Culture - promoting excellence in the arts through access, awareness and opportunities for creative expression, increased cultural awareness, and demonstrative skills concluding with an event, play or exhibit
• Sports, Fitness, Health, and Nutrition - opportunities for physical activities and education that supports healthy choices and a positive lifestyle
• Innovative - opportunities for youth ages 13 to 15 and 16 to 18 that provide customized projects supporting skills building in areas such as civic engagement, entrepreneurship, workforce development, and post-secondary education to prepare youth for the job market and life-long learning
Missing: This dataset does not include additional OST programs supported by other city agencies such as the Chicago Park District, Chicago Public Schools, the Chicago Housing Authority, etc.
Population metrics are provided at the census tract, planning district, and citywide levels of geography. You can find related vital statistics tables that contain aggregate metrics on natality (births) and mortality (deaths) of Philadelphia residents as well as social determinants of health metrics at the city and planning district levels of geography. Please refer to the metadata links below for variable definitions and the technical notes document to access detailed technical notes and variable definitions.
GNU General Public License v3.0 (GPL-3.0): https://www.gnu.org/licenses/gpl-3.0.html
This repository consists of Data/Code to reproduce the results of the paper "Metric-DST: Mitigating Selection Bias Through Diversity-guided Semi Supervised Metric Learning".
The data is shared at: https://doi.org/10.6084/m9.figshare.27720726.v1
The code is shared at: https://github.com/joanagoncalveslab/Metric-DST
State-reported data on Medicaid and CHIP eligibility renewals conducted during the reporting period and call center operations
Sources:
(1) State Medicaid and CHIP Renewal and Termination Data for the Unwinding Data Report, by reporting month:
- March and April 2023: as of June 13, 2023 (Florida: as of June 05, 2023)
- May 2023: as of July 12, 2023 (Florida: as of July 03, 2023)
- June 2023: as of August 16, 2023 (Florida: as of July 31, 2023)
- July 2023: as of September 12, 2023
- August 2023: as of October 23, 2023
- September 2023: as of November 07, 2023 (Delaware: as of November 28, 2023)
- October 2023: as of December 05, 2023
- November 2023: as of January 05, 2024
- December 2023: as of February 08, 2024
- January 2024: as of March 05, 2024
- February 2024: as of April 02, 2024
- Total number of Medicaid and CHIP beneficiaries for whom a renewal was initiated in the reporting month (metric 4) for Idaho and Nebraska: as of April 12, 2024
- March 2024: as of May 07, 2024
- April 2024: as of June 11, 2024
- May 2024: as of July 02, 2024
- June 2024: as of August 06, 2024
- July 2024: as of September 09, 2024
(2) Call Center Data from the Medicaid and CHIP Eligibility and Enrollment Performance Indicator Data as of September 10, 2024.
Notes:
- For all states, data may be affected by mitigation strategies in place, such as those related to ex parte functionality.
- Georgia reported data for individuals who continue to be eligible following a change in circumstances and were granted a new 12-month eligibility period during the April - July 2024 reporting periods, along with data on individuals due for renewal in these months.
- South Dakota did not initiate or complete renewals in the March - July 2024 reporting period due to a mitigation strategy for ex parte functionality.
- South Dakota did not initiate renewals in the February 2024 reporting period due to a mitigation strategy for ex parte functionality.
- Due to temporary renewal process changes, most renewals due in Iowa, including ex parte renewals, were not completed by the end of the reporting month for the December 2023 - February 2024 reporting periods.
- Hawaii and Vermont experienced a natural disaster, and the number of renewals initiated and completed in the reporting period were impacted due to the disaster response efforts in the month of August 2023.
- South Carolina does not have renewal outcomes to report in the month of July 2023.
- Massachusetts reports the dispositions of renewals completed in the reporting period. Therefore, the state is unable to report the number of pending renewals to CMS, and Massachusetts’ data are excluded from the national totals for the May - July 2023 reporting periods.
- Vermont experienced a natural disaster, and the number of renewals initiated and completed in the July 2023 reporting period were impacted due to the disaster response efforts.
- Texas does not have renewal outcomes to report in the month of June 2023.
- For the May 2023 reporting period, Texas’ renewal disposition data include actions completed after the end of the reporting period.
- For the April 2023 reporting period, Iowa's renewal disposition data include actions completed after the end of the reporting period.
States report renewal outcomes for a cohort in the month renewals are scheduled for completion. States that held procedural terminations reflect these data in metric 5d as pending renewals along with renewals that were due but not completed by the end of the reporting period. See the Data Sources and Metrics Definitions Overview document for a full description of the metric definitions and how they relate to each other.
Starting in the February 2024 reporting period, state-specific data may vary depending on states' unwinding timelines. State-specific data may include both unwinding renewals (renewals conducted following the end of the continuous enrollment condition) and "regular" (non-unwinding) renewals.
Metric summation note:
Total number of beneficiaries due for renewal in the reporting month (Metric 5) =
  beneficiaries whose coverage was renewed (Metric 5a)
  + beneficiaries determined ineligible for Medicaid and CHIP based on the return of a renewal form (Metric 5b)
  + beneficiaries whose coverage was terminated for a procedural or administrative reason (Metric 5c)
  + beneficiaries whose renewal was pending at the end of the month (Metric 5d).
Total number of beneficiaries whose coverage was renewed (Metric 5a) =
  beneficiaries renewed on an ex parte basis, i.e., based on available information (Metric 5a1)
  + beneficiaries renewed based on the return of a renewal form (Metric 5a2).
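These summation rules can serve as a consistency check on any state's submission. A minimal sketch with illustrative counts, not actual state-reported figures:

```python
# Consistency check for the metric summation rules above.
# Counts are illustrative, not actual state-reported figures.

report = {
    "5a1": 6_000,  # renewed ex parte (based on available information)
    "5a2": 2_000,  # renewed based on a returned renewal form
    "5b": 1_000,   # determined ineligible based on a returned renewal form
    "5c": 1_500,   # terminated for a procedural/administrative reason
    "5d": 500,     # pending at the end of the month
}

metric_5a = report["5a1"] + report["5a2"]                       # Metric 5a
metric_5 = metric_5a + report["5b"] + report["5c"] + report["5d"]  # Metric 5

print(metric_5a, metric_5)  # 8000 11000
```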
Data notes:
(C1) Callbacks are included
(C2) Call centers offer callbacks
(C3) Does not include all calls received by call centers
(C4) Does not include all calls received after business hours
(C5) Includes calls for other benefit programs
(C6) Includes only calls transferred to a live agent
(C7) Includes state-based marketplace (SBM) data
(C8) New call center added in reporting period
(C9) Wait times for callbacks are included but reported separately from live calls
(C10) Does not operate a call center
(C11) Calls handled to completion by the automated system are counted as abandoned calls
(C12) Did not report data because of technical reasons
(C13) Wait time reflects time spent on hold after the call is answered by a live agent
(R1) Initiated two months' renewals due to delayed system modifications
(R2) Data do not reflect all Medicaid and CHIP enrollees with renewal actions in the reporting period
(R3) Reporting renewals at case level, as opposed to individual level
(R4) State held some or all procedural and/or other terminations for the report month
(R5) State implemented other mitigation strategies related to submission of renewal forms
(R6) Data reflect renewals and other eligibility and enrollment actions
(R7) Data represent Medicaid and CHIP enrollees whose renewal was completed in the reporting month rather than the cohort due for renewal in the reporting month
(R8) Data undercount renewals and reflect these outcomes in another metric in the data set
(R9) Data include renewals held and reassigned to a future month
(R10) State only processed renewals in the report month that could be completed on an ex parte basis
This document includes Medicaid and CHIP data submitted to CMS by states regarding the number of renewals initiated and monthly renewal outcomes from the Unwinding Data Report and call center data from the Medicaid and CHIP Performance Indicator data set. These data include reporting metrics consistent with section 1902(tt)(1) of the Social Security Act and additional state-submitted renewal metrics.
Reporting on storage, performance, traffic, security, and event activity.
NOTE: To review the latest plan, make sure to filter the "Report Year" column to the latest year. This dataset includes various FOIL reporting metrics as part of the NYC Open Data Compliance Plan.
Almost nine in ten data center owners and operators said that they collected power consumption data in 2024, while around three-quarters compiled Power Usage Effectiveness (PUE) data. Data center operators are under increased pressure to publish such efficiency metrics, with artificial intelligence (AI) adoption prompting surging demand for data center capacity.
Attribution-NoDerivs 4.0 (CC BY-ND 4.0): https://creativecommons.org/licenses/by-nd/4.0/
Visualizing institute based repository data re-usage trends
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
This dataset contains software metric and design pattern data for around 100,000 projects from the Maven Central repository. The data was collected and analyzed as part of my master's thesis "Mining Software Repositories for the Effects of Design Patterns on Software Quality" (https://www.overleaf.com/read/vnfhydqxmpvx, https://zenodo.org/record/4048275).
The included qualisign.* files all contain the same data in different formats:
- qualisign.sql: standard SQL format (exported using "pg_dump --inserts ...")
- qualisign.psql: PostgreSQL plain format (exported using "pg_dump -Fp ...")
- qualisign.csql: PostgreSQL custom format (exported using "pg_dump -Fc ...")
create-tables.sql has to be executed before importing one of the qualisign.* files. Once qualisign.*sql has been imported, create-views.sql can be executed to preprocess the data, thereby creating materialized views that are more appropriate for data analysis purposes.
Software metrics were calculated using CKJM extended: http://gromit.iiar.pwr.wroc.pl/p_inf/ckjm/
Included software metrics are (21 total):
- AMC: Average Method Complexity
- CA: Afferent Coupling
- CAM: Cohesion Among Methods
- CBM: Coupling Between Methods
- CBO: Coupling Between Objects
- CC: Cyclomatic Complexity
- CE: Efferent Coupling
- DAM: Data Access Metric
- DIT: Depth of Inheritance Tree
- IC: Inheritance Coupling
- LCOM: Lack of Cohesion of Methods (Chidamber and Kemerer)
- LCOM3: Lack of Cohesion of Methods (Constantine and Graham)
- LOC: Lines of Code
- MFA: Measure of Functional Abstraction
- MOA: Measure of Aggregation
- NOC: Number of Children
- NOM: Number of Methods
- NOP: Number of Polymorphic Methods
- NPM: Number of Public Methods
- RFC: Response for Class
- WMC: Weighted Methods per Class
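As a small illustration of one of these metrics: cyclomatic complexity (CC) is commonly approximated as 1 plus the number of decision points in a routine. CKJM extended computes it on Java bytecode; the sketch below only illustrates the idea on Python source:

```python
import ast

# Simplified cyclomatic complexity: 1 + number of decision points.
# This is an illustrative sketch, not the CKJM extended implementation.
DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))

sample = """
def classify(x):
    if x < 0:
        return "negative"
    for _ in range(3):
        if x > 10:
            return "large"
    return "small"
"""
print(cyclomatic_complexity(sample))  # 4  (base 1 + two ifs + one for)
```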
In the qualisign.* data, these metrics are only available on the class level. create-views.sql additionally provides averages of these metrics on the package and project levels.
Design patterns were detected using SSA: https://users.encs.concordia.ca/~nikolaos/pattern_detection.html
Included design patterns are (15 total):
- Adapter
- Bridge
- Chain of Responsibility
- Command
- Composite
- Decorator
- Factory Method
- Observer
- Prototype
- Proxy
- Singleton
- State
- Strategy
- Template Method
- Visitor
The code to generate the dataset is available at: https://github.com/jaichberg/qualisign
The code to perform quality analysis on the dataset is available at: https://github.com/jaichberg/qualisign-analysis
Prognostics performance evaluation has gained significant attention in the past few years. As prognostics technology matures and more sophisticated methods for prognostic uncertainty management are developed, a standardized methodology for performance evaluation becomes extremely important to guide improvement efforts in a constructive manner. This paper is a continuation of previous efforts in which several new evaluation metrics tailored for prognostics were introduced and shown to evaluate various algorithms more effectively than conventional metrics. Specifically, this paper presents a detailed discussion of how these metrics should be interpreted and used. Several shortcomings identified while applying these metrics to a variety of real applications are also summarized, along with discussions that attempt to alleviate these problems. Further, these metrics have been enhanced to incorporate probability distribution information from prognostic algorithms, as opposed to evaluation based on point estimates only. Several methods have been suggested, and guidelines have been provided, to help choose one method over another based on probability distribution characteristics.
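One widely used prognostics evaluation metric of the kind discussed here is alpha-lambda accuracy, which checks whether a remaining-useful-life (RUL) prediction made at a given time falls within a +/- alpha band around the true RUL. The sketch below uses point estimates with illustrative values; the enhancements described in the abstract extend such metrics to full probability distributions:

```python
# Alpha-lambda accuracy on point estimates: a prediction passes if it lies
# within +/- alpha (a fraction) of the true remaining useful life (RUL).
# Values below are illustrative; see the paper for full metric definitions.

def alpha_lambda_pass(predicted_rul: float, true_rul: float, alpha: float = 0.2) -> bool:
    lower = (1 - alpha) * true_rul
    upper = (1 + alpha) * true_rul
    return lower <= predicted_rul <= upper

# Hypothetical run: end of life at t = 100, predictions made at t = 40, 60, 80.
true_ruls = [60.0, 40.0, 20.0]
predictions = [50.0, 44.0, 27.0]
results = [alpha_lambda_pass(p, t) for p, t in zip(predictions, true_ruls)]
print(results)  # [True, True, False]
```

Note how the +/- 20% band shrinks in absolute terms as end of life approaches, so late predictions must be increasingly precise to pass.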
During a 2024 survey among marketing decision-makers worldwide, around ** percent of respondents reported tracking their businesses' marketing/sales pipeline as a key performance indicator (KPI). Approximately ** percent of the interviewees from business-to-consumer (B2C) brands selected marketing/sales funnel. Less than half of overall surveyed marketers included customer lifetime value among their KPIs.