100+ datasets found
  1. Dataset for: Robust versus consistent variance estimators in marginal...

    • wiley.figshare.com
    pdf
    Updated May 30, 2023
    Cite
    Dirk Enders; Susanne Engel; Roland Linder; Iris Pigeot (2023). Dataset for: Robust versus consistent variance estimators in marginal structural Cox models [Dataset]. http://doi.org/10.6084/m9.figshare.6203456.v1
    Available download formats: pdf
    Dataset updated
    May 30, 2023
    Dataset provided by
    Wiley (https://www.wiley.com/)
    Authors
    Dirk Enders; Susanne Engel; Roland Linder; Iris Pigeot
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    In survival analyses, inverse-probability-of-treatment (IPT) and inverse-probability-of-censoring (IPC) weighted estimators of parameters in marginal structural Cox models (Cox MSMs) are often used to estimate treatment effects in the presence of time-dependent confounding and censoring. In most applications, a robust variance estimator of the IPT and IPC weighted estimator is calculated leading to conservative confidence intervals. This estimator assumes that the weights are known rather than estimated from the data. Although a consistent estimator of the asymptotic variance of the IPT and IPC weighted estimator is generally available, applications and thus information on the performance of the consistent estimator are lacking. Reasons might be a cumbersome implementation in statistical software, which is further complicated by missing details on the variance formula. In this paper, we therefore provide a detailed derivation of the variance of the asymptotic distribution of the IPT and IPC weighted estimator and explicitly state the necessary terms to calculate a consistent estimator of this variance. We compare the performance of the robust and the consistent variance estimator in an application based on routine health care data and in a simulation study. The simulation reveals no substantial differences between the two estimators in medium and large data sets with no unmeasured confounding, but the consistent variance estimator performs poorly in small samples or under unmeasured confounding, if the number of confounders is large. We thus conclude that the robust estimator is more appropriate for all practical purposes.
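
    The abstract describes the usual IPT-weighted workflow; purely as a point of reference, here is a minimal Python sketch of that workflow for a single baseline treatment (the paper's setting is time-dependent), with stabilized IPT weights from a logistic model and the robust (sandwich) variance the abstract calls conservative. The file name, column names, and the use of scikit-learn and lifelines are assumptions, not part of the archived dataset.

```python
# Minimal sketch (assumed column names): stabilized IPT weights from a logistic
# propensity model, then a weighted Cox fit with the robust (sandwich) variance.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.read_csv("msm_data.csv")   # hypothetical file: time, event, treat, L1, L2
confounders = ["L1", "L2"]

# Denominator: P(treat = 1 | confounders); numerator: marginal P(treat = 1).
ps = LogisticRegression(max_iter=1000).fit(df[confounders], df["treat"])
p_denom = ps.predict_proba(df[confounders])[:, 1]
p_num = df["treat"].mean()
df["iptw"] = df["treat"] * (p_num / p_denom) + (1 - df["treat"]) * ((1 - p_num) / (1 - p_denom))

# Weighted Cox model; robust=True requests the sandwich variance that treats
# the weights as known, which is the conservative estimator discussed above.
cph = CoxPHFitter()
cph.fit(df[["time", "event", "treat", "iptw"]], duration_col="time",
        event_col="event", weights_col="iptw", robust=True)
print(cph.summary[["coef", "se(coef)"]])
```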

  2. Script for calculate variance partition method

    • dataverse.harvard.edu
    • dataone.org
    Updated Sep 21, 2022
    Cite
    Gabriela Alves-Ferreira (2022). Script for calculate variance partition method [Dataset]. http://doi.org/10.7910/DVN/SDXKGF
    Available download formats: Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Sep 21, 2022
    Dataset provided by
    Harvard Dataverse
    Authors
    Gabriela Alves-Ferreira
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    Script for calculating the variance partition method and the hierarchical partition method at regional and local scales.
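
    The archived script is in R; purely as an illustration of the general technique (not the author's code), a minimal Python sketch of variance partitioning between two predictor sets via R² differences, with synthetic data standing in for the regional-scale and local-scale predictors.

```python
# Variance partitioning between two predictor sets via R^2 differences:
# a = unique to X1, c = unique to X2, b = shared, using OLS fits.
import numpy as np
from sklearn.linear_model import LinearRegression

def r2(X, y):
    return LinearRegression().fit(X, y).score(X, y)

rng = np.random.default_rng(0)
X1 = rng.normal(size=(200, 3))                        # e.g. regional-scale predictors (synthetic)
X2 = rng.normal(size=(200, 2))                        # e.g. local-scale predictors (synthetic)
y = X1[:, 0] + 0.5 * X2[:, 0] + rng.normal(size=200)

r2_full = r2(np.hstack([X1, X2]), y)                  # [a + b + c]
r2_x1, r2_x2 = r2(X1, y), r2(X2, y)                   # [a + b], [b + c]

a = r2_full - r2_x2                                   # unique to X1
c = r2_full - r2_x1                                   # unique to X2
b = r2_x1 + r2_x2 - r2_full                           # shared fraction
print(f"unique X1 = {a:.3f}, unique X2 = {c:.3f}, shared = {b:.3f}, residual = {1 - r2_full:.3f}")
```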

  3. _Attention what is it like [Dataset]

    • data.niaid.nih.gov
    Updated Mar 7, 2021
    Cite
    Dinis Pereira, Vitor Manuel (2021). _Attention what is it like [Dataset] [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_780412
    Dataset updated
    Mar 7, 2021
    Dataset provided by
    LanCog Research Group, Universidade de Lisboa
    Authors
    Dinis Pereira, Vitor Manuel
    License

    Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
    License information was derived automatically

    Description

    R Core Team. (2016). R: A language and environment for statistical computing. R Foundation for Statistical Computing.

    Supplement to Occipital and left temporal instantaneous amplitude and frequency oscillations correlated with access and phenomenal consciousness (https://philpapers.org/rec/PEROAL-2).

    Occipital and left temporal instantaneous amplitude and frequency oscillations correlated with access and phenomenal consciousness move from the features of the ERP characterized in Occipital and Left Temporal EEG Correlates of Phenomenal Consciousness (Pereira, 2015, https://doi.org/10.1016/b978-0-12-802508-6.00018-1, https://philpapers.org/rec/PEROAL) towards the instantaneous amplitude and frequency of event-related changes correlated with a contrast in access and in phenomenology.

    Occipital and left temporal instantaneous amplitude and frequency oscillations correlated with access and phenomenal consciousness proceed as follows.

    The first section applies empirical mode decomposition (EMD) with post-processed Ensemble Empirical Mode Decomposition (postEEMD) and the Hilbert-Huang Transform (HHT) (Xie, G., Guo, Y., Tong, S., and Ma, L., 2014. Calculate excess mortality during heatwaves using Hilbert-Huang transform algorithm. BMC Medical Research Methodology, 14, 35).

    The second section calculates the variance inflation factor (VIF); a minimal sketch of this computation follows this description.

    The third section fits partial least squares regression (PLSR) and reports the minimal root mean squared error of prediction (RMSEP).

    The last section applies partial least squares regression (PLSR) with the significance multivariate correlation (sMC) statistic.
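
    A minimal sketch of the VIF computation mentioned in the second section, using the standard definition VIF_j = 1 / (1 - R_j²); the data here are synthetic and this is not the archived analysis code.

```python
# VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing predictor j
# on all remaining predictors.
import numpy as np
from sklearn.linear_model import LinearRegression

def vif(X):
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        r2 = LinearRegression().fit(others, X[:, j]).score(others, X[:, j])
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))
X[:, 3] = X[:, 0] + 0.1 * rng.normal(size=100)   # induce collinearity
print(vif(X))                                    # first and last columns show inflated VIFs
```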

  4. Dataset for: A noniterative sample size procedure for tests based on t...

    • wiley.figshare.com
    txt
    Updated Jun 1, 2023
    Cite
    Yongqiang Tang (2023). Dataset for: A noniterative sample size procedure for tests based on t distributions [Dataset]. http://doi.org/10.6084/m9.figshare.6151220.v1
    Available download formats: txt
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    Wiley (https://www.wiley.com/)
    Authors
    Yongqiang Tang
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    A noniterative sample size procedure is proposed for a general hypothesis test based on the t distribution by modifying and extending Guenther’s (1981) approach for the one sample and two sample t tests. The generalized procedure is employed to determine the sample size for treatment comparisons using the analysis of covariance (ANCOVA) and the mixed effects model for repeated measures (MMRM) in randomized clinical trials. The sample size is calculated by adding a few simple correction terms to the sample size from the normal approximation to account for the nonnormality of the t statistic and lower order variance terms, which are functions of the covariates in the model. But it does not require specifying the covariate distribution. The noniterative procedure is suitable for superiority tests, noninferiority tests and a special case of the tests for equivalence or bioequivalence, and generally yields the exact or nearly exact sample size estimate after rounding to an integer. The method for calculating the exact power of the two sample t test with unequal variance in superiority trials is extended to equivalence trials. We also derive accurate power formulae for ANCOVA and MMRM, and the formula for ANCOVA is exact for normally distributed covariates. Numerical examples demonstrate the accuracy of the proposed methods particularly in small samples.
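
    Not the paper's noniterative procedure, but as background, the exact power of the equal-variance two-sample t test via the noncentral t distribution, which is the quantity such sample size formulas approximate; a two-sided test and equal group sizes are assumed.

```python
# Exact power of the equal-variance two-sample t test via the noncentral t
# distribution (standard result; not the paper's noniterative formula).
from scipy import stats

def two_sample_t_power(n_per_group, delta, sd, alpha=0.05):
    df = 2 * n_per_group - 2
    nc = delta / (sd * (2.0 / n_per_group) ** 0.5)   # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    # Two-sided power (the second term is usually negligible).
    return (1 - stats.nct.cdf(t_crit, df, nc)) + stats.nct.cdf(-t_crit, df, nc)

print(round(two_sample_t_power(64, 0.5, 1.0), 3))    # ~0.80 for a medium effect
```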

  5. Low Variance Dataset

    • kaggle.com
    zip
    Updated Jun 17, 2021
    Cite
    Ezgi Turalı (2021). Low Variance Dataset [Dataset]. https://www.kaggle.com/ezgitural/low-variance-dataset
    Available download formats: zip (422949 bytes)
    Dataset updated
    Jun 17, 2021
    Authors
    Ezgi Turalı
    Description

    Context

    I needed a low variance dataset for my project to make a point. I could not find it in here. So, I got it somehow and there you go!

  6. Data from: Host nutrition alters the variance in parasite transmission...

    • datadryad.org
    • datasetcatalog.nlm.nih.gov
    • +2more
    zip
    Updated Feb 13, 2013
    Cite
    Pedro F. Vale; Marc Choisy; Tom J. Little (2013). Host nutrition alters the variance in parasite transmission potential [Dataset]. http://doi.org/10.5061/dryad.f1338
    Available download formats: zip
    Dataset updated
    Feb 13, 2013
    Dataset provided by
    Dryad
    Authors
    Pedro F. Vale; Marc Choisy; Tom J. Little
    Time period covered
    Jan 14, 2013
    Description

    Raw data: Data includes both infected and non-infected hosts. Note that, except for the analysis of susceptibility, only infected individuals were included in the analyses. These data are a subset of a larger dataset published previously in Vale PF, Wilson AJ, Best A, Boots M, Little TJ. (2011) Epidemiological, evolutionary, and coevolutionary implications of context-dependent parasitism. The American Naturalist 177:510-521.

  7. Origin of Variances in the Oldest-Old: Octogenarian Twins (OCTO-Twin) Wave 5...

    • researchdata.se
    • demo.researchdata.se
    • +1more
    Updated Apr 3, 2023
    Cite
    Linda Hassing (2023). Origin of Variances in the Oldest-Old: Octogenarian Twins (OCTO-Twin) Wave 5 [Dataset]. https://researchdata.se/en/catalogue/dataset/2021-195-5
    Dataset updated
    Apr 3, 2023
    Dataset provided by
    University of Gothenburg
    Authors
    Linda Hassing
    Area covered
    Sweden
    Description

    The OCTO-Twin Study aims to investigate the etiology of individual differences among twin pairs aged 80 and older, across a range of domains including health and functional capacity, cognitive functioning, psychological well-being, personality and personal control. In the study, twin pairs were drawn from the Swedish Twin Registry. At the first wave, the twins had to be born in 1913 or earlier and both partners in the pair had to accept participation. At baseline in 1991-94, 351 twin pairs (149 monozygotic and 202 like-sex dizygotic pairs) were investigated (mean age 83.6 years; 67% female). The two-year longitudinal follow-ups were conducted on all twins who were alive and agreed to participate. Data have been collected at five waves over a total of eight years.

    In wave 5, 43 twin pairs participated, with a total of 222 individuals. Refer to the description of wave 1 (the baseline) and the individual datasets in the NEAR portal for more details on variable groups and individual variables.

  8. Multi-Laboratory Hematoxylin and Eosin Staining Variance Supervised Machine...

    • datasetcatalog.nlm.nih.gov
    • dataverse.harvard.edu
    • +1more
    Updated Nov 4, 2022
    Cite
    Ruusuvuori, Pekka; Äyrämö, Sami; Pölönen, Ilkka; Prezja, Fabi; Kuopio, Teijo (2022). Multi-Laboratory Hematoxylin and Eosin Staining Variance Supervised Machine Learning Dataset [Dataset]. http://doi.org/10.7910/DVN/5YNF3B
    Dataset updated
    Nov 4, 2022
    Authors
    Ruusuvuori, Pekka; Äyrämö, Sami; Pölönen, Ilkka; Prezja, Fabi; Kuopio, Teijo
    Description

    We provide the generated dataset used for supervised machine learning in the related article. The data are in tabular format and contain all principal components and ground truth labels per tissue type. Tissue type codes used are: C1 for kidney, C2 for skin, and C3 for colon. 'PC' stands for principal component. For feature extraction specifications, please see the original design in the related article. Features have been extracted independently for each tissue type.
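
    Since the table pairs principal-component features with ground-truth labels per tissue type, a supervised-learning workflow on it might look like the following sketch; the file name, column names, and choice of classifier are assumptions rather than the dataset's documented schema.

```python
# Sketch for the tabular PC-feature data (file and column names are assumed,
# not the dataset's actual layout): train and evaluate a classifier per tissue type.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("he_staining_pcs_C1.csv")   # hypothetical export for kidney (C1)
X = df.filter(like="PC")                     # principal-component columns
y = df["label"]                              # ground-truth label column (assumed name)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```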

  9. Stock Portfolio Optimization Dataset for Efficient

    • kaggle.com
    zip
    Updated Aug 30, 2023
    Cite
    Emmanuel Ochiba (2023). Stock Portfolio Optimization Dataset for Efficient [Dataset]. https://www.kaggle.com/datasets/chibss/stock-dataset-for-portfolio-optimization
    Available download formats: zip (8610 bytes)
    Dataset updated
    Aug 30, 2023
    Authors
    Emmanuel Ochiba
    Description

    This dataset has been meticulously curated to assist investment analysts, like you, in performing mean-variance optimization for constructing efficient portfolios. The dataset contains historical financial data for a selection of assets, enabling the calculation of risk and return characteristics necessary for portfolio optimization. The goal is to help you determine the most effective allocation of assets to achieve optimal risk-return trade-offs.
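
    As an illustration of the mean-variance optimization the dataset is intended for, a minimal Markowitz-style sketch: minimize portfolio variance subject to a target expected return, long-only and fully invested. The file name and column layout are assumptions.

```python
# Minimal Markowitz sketch: minimum-variance weights for a target expected
# return, long-only, fully invested (file and column names are assumptions).
import numpy as np
import pandas as pd
from scipy.optimize import minimize

prices = pd.read_csv("prices.csv", index_col=0)   # hypothetical: dates x tickers
rets = prices.pct_change().dropna()
mu = rets.mean().to_numpy()                       # per-period expected returns
cov = rets.cov().to_numpy()
n = len(mu)
target = mu.mean()                                # a feasible target return

res = minimize(
    lambda w: w @ cov @ w,                        # portfolio variance
    x0=np.full(n, 1.0 / n),
    bounds=[(0.0, 1.0)] * n,                      # long-only
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0},
                 {"type": "eq", "fun": lambda w: w @ mu - target}],
)
weights = pd.Series(res.x, index=rets.columns).round(4)
print(weights[weights > 0])
```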

  10. Data from: Selective increases in inter-individual variability in response...

    • datadryad.org
    • datasetcatalog.nlm.nih.gov
    • +3more
    zip
    Updated Oct 29, 2018
    Cite
    Julia C. Korholz; Sara Zocher; Anna N. Grzyb; Benjamin Morisse; Alexandra Poetzsch; Fanny Ehret; Christopher Schmied; Gerd Kempermann (2018). Selective increases in inter-individual variability in response to environmental enrichment in female mice [Dataset]. http://doi.org/10.5061/dryad.12cm083
    Available download formats: zip
    Dataset updated
    Oct 29, 2018
    Dataset provided by
    Dryad
    Authors
    Julia C. Korholz; Sara Zocher; Anna N. Grzyb; Benjamin Morisse; Alexandra Poetzsch; Fanny Ehret; Christopher Schmied; Gerd Kempermann
    Time period covered
    Feb 22, 2018
    Area covered
    Not applicable
    Description

    Supplementary File1_phenotypes: The txt file contains the phenotypes assessed in our study for all mice under control (CTRL) or enriched conditions. The file is comma delimited; each mouse is one line. All abbreviations and the phenotypes are explained in the article.

  11. Data from: Estimation across Data Sets: Two-Stage Auxiliary Instrumental...

    • dataverse.harvard.edu
    Updated Dec 21, 2009
    Cite
    Charles H. Franklin (2009). Estimation across Data Sets: Two-Stage Auxiliary Instrumental Variables Estimation (2SAIV) [Dataset]. http://doi.org/10.7910/DVN/HL5YUY
    Available download formats: Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Dec 21, 2009
    Dataset provided by
    Harvard Dataverse
    Authors
    Charles H. Franklin
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    Theories demand much of data, often more than a single data collection can provide. For example, many important research questions are set in the past and must rely on data collected at that time and for other purposes. As a result, we often find that the data lack crucial variables. Another common problem arises when we wish to estimate the relationship between variables that are measured in different data sets. A variation of this occurs with a split half sample design in which one or more important variables appear on the "wrong" half. Finally, we may need panel data but have only cross sections available. In each of these cases our ability to estimate the theoretically determined equation is limited by the data that are available. In many cases there is simply no solution, and theory must await new opportunities for testing. Under certain circumstances, however, we may still be able to estimate relationships between variables even though they are not measured on the same set of observations. This technique, which I call two-stage auxiliary instrumental variables (2SAIV), provides some new leverage on such problems and offers the opportunity to test hypotheses that were previously out of reach. This article develops the 2SAIV estimator, proves its consistency and derives its asymptotic variance. A set of simulations illustrates the performance of the estimator in finite samples and several applications are sketched out.
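
    The 2SAIV estimator itself, and its corrected asymptotic variance, are developed in the article; the following rough sketch only illustrates the general two-stage idea described above: fit an auxiliary-data model for the regressor that the main data set lacks, then use its predictions in the main-data regression. All data are synthetic, and the naive second-stage standard errors would not be valid.

```python
# Rough two-stage sketch of the general idea (NOT Franklin's 2SAIV estimator or
# its variance correction): model a missing regressor in an auxiliary data set,
# then plug its predictions into the main-data regression.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)

# Auxiliary data set: contains both the auxiliary variable z and the regressor x.
z_aux = rng.normal(size=(500, 1))
x_aux = 2.0 * z_aux[:, 0] + rng.normal(size=500)
stage1 = LinearRegression().fit(z_aux, x_aux)

# Main data set: contains z and the outcome y, but x was never measured.
z_main = rng.normal(size=(500, 1))
x_true = 2.0 * z_main[:, 0] + rng.normal(size=500)
y_main = 1.5 * x_true + rng.normal(size=500)

x_hat = stage1.predict(z_main)                           # first-stage predictions
stage2 = LinearRegression().fit(x_hat.reshape(-1, 1), y_main)
# The slope is consistent for 1.5, but conventional second-stage standard
# errors are wrong; the article derives the correct asymptotic variance.
print("second-stage slope ~ 1.5:", round(stage2.coef_[0], 3))
```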

  12. Data from: Meta-analysis of variance: an illustration comparing the effects...

    • datadryad.org
    • data.niaid.nih.gov
    • +1more
    zip
    Updated Jul 15, 2017
    Cite
    Alistair M. Senior; Alison K. Gosby; Jing Lu; Stephen J. Simpson; David Raubenheimer (2017). Meta-analysis of variance: an illustration comparing the effects of two dietary interventions on variability in weight [Dataset]. http://doi.org/10.5061/dryad.337dr
    Available download formats: zip
    Dataset updated
    Jul 15, 2017
    Dataset provided by
    Dryad
    Authors
    Alistair M. Senior; Alison K. Gosby; Jing Lu; Stephen J. Simpson; David Raubenheimer
    Time period covered
    Jul 13, 2016
    Description

    Data Package: This is the data package to accompany Senior et al. 2016: Meta-analysis of variance: an illustration comparing the effects of two dietary interventions on variability in weight. The zip file contains three R files. 1) "Analysis.R" is the script to perform the analysis and is heavily commented. 2) "Data.Objects.Rdata" is an R data file containing all of the objects necessary to replicate the analysis (see comments in "Analysis.R"). 3) "EffectSizeFunctions.R" contains various functions to calculate effect sizes, and is called in "Analysis.R".
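
    As an illustration of the kind of effect size used in meta-analysis of variance, a minimal sketch of the log variance ratio (lnVR) with its approximate sampling variance; this is a standard formulation and may differ from the functions in the archived "EffectSizeFunctions.R".

```python
# Log variance ratio (lnVR) with its approximate sampling variance, a common
# effect size for meta-analysis of variance (illustrative only).
import math

def ln_vr(sd_trt, n_trt, sd_ctl, n_ctl):
    est = math.log(sd_trt / sd_ctl) + 1 / (2 * (n_trt - 1)) - 1 / (2 * (n_ctl - 1))
    var = 1 / (2 * (n_trt - 1)) + 1 / (2 * (n_ctl - 1))
    return est, var

est, var = ln_vr(sd_trt=4.2, n_trt=30, sd_ctl=3.1, n_ctl=28)   # made-up numbers
print(f"lnVR = {est:.3f}, SE = {var ** 0.5:.3f}")
```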

  13. IMP-8 Weimer Propagation Details at 1 min Resolution - Dataset - NASA Open...

    • data.nasa.gov
    Updated Apr 1, 2025
    Cite
    nasa.gov (2025). IMP-8 Weimer Propagation Details at 1 min Resolution - Dataset - NASA Open Data Portal [Dataset]. https://data.nasa.gov/dataset/imp-8-weimer-propagation-details-at-1-min-resolution
    Dataset updated
    Apr 1, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Description

    IMP-8 Weimer-propagated solar wind data with linearly interpolated time delay, cosine angle, and goodness information for the propagated data at 1 min resolution. This data set consists of solar wind data that have first been propagated to a position just outside of the nominal bow shock (about 17, 0, 0 Re) and then linearly interpolated to 1 min resolution using the interp1.m function in MATLAB. The input data for this data set are 1 min resolution processed solar wind data constructed by Dr. J.M. Weygand.

    The method of propagation is similar to the minimum variance technique and is outlined in Weimer et al. [2003; 2004]. The basic method is to find the minimum variance direction of the magnetic field in the plane orthogonal to the mean magnetic field direction. This minimum variance direction is then dotted with the difference between the final position vector and the original position vector, and the result is divided by the minimum variance direction dotted with the solar wind velocity vector, which gives the propagation time. This method does not work well for shocks or for minimum variance directions tilted more than 70 degrees from the sun-earth line.

    This data set was originally constructed by Dr. J.M. Weygand for Prof. R.L. McPherron, who was the principal investigator of two National Science Foundation studies: GEM Grant ATM 02-1798 and Space Weather Grant ATM 02-08501. These data were primarily used in superposed epoch studies.

    References: Weimer, D. R. (2004), Correction to "Predicting interplanetary magnetic field (IMF) propagation delay times using the minimum variance technique," J. Geophys. Res., 109, A12104, doi:10.1029/2004JA010691. Weimer, D.R., D.M. Ober, N.C. Maynard, M.R. Collier, D.J. McComas, N.F. Ness, C.W. Smith, and J. Watermann (2003), Predicting interplanetary magnetic field (IMF) propagation delay times using the minimum variance technique, J. Geophys. Res., 108, 1026, doi:10.1029/2002JA009405.
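
    A minimal sketch of the propagation-time formula described above: take the minimum variance direction n of the IMF in the plane orthogonal to the mean field, then compute dt = n·(r_target − r_origin) / (n·v_sw). The inputs below are synthetic, and this is an illustration of the stated formula, not the code used to build the data set.

```python
# Sketch of the propagation-delay formula described above (synthetic inputs).
import numpy as np

def propagation_delay(B, r_origin, r_target, v_sw):
    """B: (N, 3) IMF samples; positions in km; v_sw in km/s -> delay in seconds."""
    b_hat = B.mean(axis=0)
    b_hat = b_hat / np.linalg.norm(b_hat)
    # Orthonormal basis (e1, e2) of the plane orthogonal to the mean field.
    helper = np.array([0.0, 1.0, 0.0]) if abs(b_hat[0]) > 0.9 else np.array([1.0, 0.0, 0.0])
    e1 = np.cross(b_hat, helper); e1 = e1 / np.linalg.norm(e1)
    e2 = np.cross(b_hat, e1)
    # Minimum variance direction of B within that plane.
    coords = B @ np.column_stack([e1, e2])           # (N, 2) in-plane components
    evals, evecs = np.linalg.eigh(np.cov(coords.T))
    w = evecs[:, np.argmin(evals)]
    n = w[0] * e1 + w[1] * e2
    return float(n @ (r_target - r_origin) / (n @ v_sw))

rng = np.random.default_rng(3)
B = np.array([5.0, -2.0, 1.0]) + rng.normal(size=(600, 3)) * np.array([0.2, 1.0, 0.7])
r_origin = np.array([1.5e6, 1.0e5, 0.0])              # spacecraft position, km (synthetic)
r_target = np.array([17 * 6371.0, 0.0, 0.0])          # ~17 Re on the sun-earth line
v_sw = np.array([-420.0, 0.0, 0.0])                   # solar wind velocity, km/s
print(f"propagation delay ~ {propagation_delay(B, r_origin, r_target, v_sw):.0f} s")
```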

  14. Data from: Submerged macrophytes affect the temporal variability of aquatic...

    • datadryad.org
    • data.niaid.nih.gov
    zip
    Updated Oct 28, 2020
    Cite
    Moritz Lürig; Rebecca Best; Vasilis Dakos; Blake Matthews (2020). Submerged macrophytes affect the temporal variability of aquatic ecosystems [Dataset]. http://doi.org/10.5061/dryad.18931zcvt
    Available download formats: zip
    Dataset updated
    Oct 28, 2020
    Dataset provided by
    Dryad
    Authors
    Moritz Lürig; Rebecca Best; Vasilis Dakos; Blake Matthews
    Time period covered
    Oct 27, 2020
    Description

    1. Submerged macrophytes are important foundation species that can strongly influence the structure and functioning of aquatic ecosystems, but little is known about the temporal variation and the timescales of these effects (i.e. from hourly and daily to monthly).

    2. Here, we conducted an outdoor experiment in replicated mesocosms (1000 L) where we manipulated the presence and absence of macrophytes to investigate the temporal variability of their ecosystem effects. We measured several parameters (chlorophyll-a, phycocyanin, dissolved organic matter [DOM], and oxygen) with high-resolution sensors (15 min intervals) over several months (94 days from spring to fall), and modelled metabolic rates of each replicate ecosystem in a Bayesian framework. We also implemented a simple model to explore competitive interactions between phytoplankton and macrophytes as a driver of variability in chlorophyll-a.

    3. Over the entire experiment, macrophytes had a positive effect on mean DOM concentra...

  15. Subfunctions for calculating variance.

    • datasetcatalog.nlm.nih.gov
    • plos.figshare.com
    Updated May 16, 2024
    Cite
    Jia, Xiaoyan; Zhang, Qinghui; Zhang, Meilin; Ding, Yang; LI, Junqiu; Jin, Yiting (2024). Subfunctions for calculating variance. [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0001274866
    Dataset updated
    May 16, 2024
    Authors
    Jia, Xiaoyan; Zhang, Qinghui; Zhang, Meilin; Ding, Yang; LI, Junqiu; Jin, Yiting
    Description

    The analysis of critical states during fracture of wood materials is crucial for wood building safety monitoring, wood processing, etc. In this paper, beech and camphor pine are selected as the research objects, and the acoustic emission signals during the fracture process of the specimens are analyzed by three-point bending load experiments. On the one hand, the critical state interval of a complex acoustic emission signal system is determined by selecting characteristic parameters in the natural time domain. On the other hand, an improved method of b_value analysis in the natural time domain is proposed based on the characteristics of the acoustic emission signal. The K-value, which represents the beginning of the critical state of a complex acoustic emission signal system, is further defined by the improved method of b_value in the natural time domain. For beech, the analysis of critical state time based on characteristic parameters can predict the “collapse” time 8.01 s in advance, while for camphor pines, 3.74 s in advance. K-value can be analyzed at least 3 s in advance of the system “crash” time for beech and 4 s in advance of the system “crash” time for camphor pine. The results show that compared with traditional time-domain acoustic emission signal analysis, natural time-domain acoustic emission signal analysis can discover more available feature information to characterize the state of the signal. Both the characteristic parameters and Natural_Time_b_value analysis in the natural time domain can effectively characterize the time when the complex acoustic emission signal system enters the critical state. Critical state analysis can provide new ideas for wood health monitoring and complex signal processing, etc.
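
    The K-value and the natural-time b_value are defined in the paper itself; for background only, a minimal sketch of the conventional maximum-likelihood b-value estimate (Aki's formula) for magnitudes above a completeness threshold, with synthetic data.

```python
# Background sketch: conventional maximum-likelihood b-value (Aki's estimator),
# b = log10(e) / (mean(M) - Mc), for magnitudes M at or above a completeness
# threshold Mc. This is NOT the paper's natural-time b_value or K-value.
import numpy as np

def b_value(magnitudes, m_c):
    m = np.asarray(magnitudes)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

rng = np.random.default_rng(4)
# Synthetic Gutenberg-Richter-like magnitudes with true b ~ 1.0
mags = rng.exponential(scale=np.log10(np.e) / 1.0, size=2000) + 0.5
print(f"estimated b ~ {b_value(mags, m_c=0.5):.2f}")
```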

  16. ACE Solar Wind Weimer Propagation Details at 1 min Resolution - Dataset -...

    • data.nasa.gov
    Updated Aug 21, 2025
    Cite
    nasa.gov (2025). ACE Solar Wind Weimer Propagation Details at 1 min Resolution - Dataset - NASA Open Data Portal [Dataset]. https://data.nasa.gov/dataset/ace-solar-wind-weimer-propagation-details-at-1-min-resolution
    Dataset updated
    Aug 21, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Description

    ACE Weimer-propagated solar wind data with linearly interpolated time delay, cosine angle, and goodness information for the propagated data at 1 min resolution. This data set consists of solar wind data that have first been propagated to a position just outside of the nominal bow shock (about 17, 0, 0 Re) and then linearly interpolated to 1 min resolution using the interp1.m function in MATLAB. The input data for this data set are 1 min resolution processed solar wind data constructed by Dr. J.M. Weygand.

    The method of propagation is similar to the minimum variance technique and is outlined in Weimer et al. [2003; 2004]. The basic method is to find the minimum variance direction of the magnetic field in the plane orthogonal to the mean magnetic field direction. This minimum variance direction is then dotted with the difference between the final position vector and the original position vector, and the result is divided by the minimum variance direction dotted with the solar wind velocity vector, which gives the propagation time. This method does not work well for shocks or for minimum variance directions tilted more than 70 degrees from the sun-earth line.

    This data set was originally constructed by Dr. J.M. Weygand for Prof. R.L. McPherron, who was the principal investigator of two National Science Foundation studies: GEM Grant ATM 02-1798 and Space Weather Grant ATM 02-08501. These data were primarily used in superposed epoch studies.

    References: Weimer, D. R. (2004), Correction to "Predicting interplanetary magnetic field (IMF) propagation delay times using the minimum variance technique," J. Geophys. Res., 109, A12104, doi:10.1029/2004JA010691. Weimer, D.R., D.M. Ober, N.C. Maynard, M.R. Collier, D.J. McComas, N.F. Ness, C.W. Smith, and J. Watermann (2003), Predicting interplanetary magnetic field (IMF) propagation delay times using the minimum variance technique, J. Geophys. Res., 108, 1026, doi:10.1029/2002JA009405.

  17. ISEE 1 Solar Wind Weimer Propagation Details at 1 min Resolution - Dataset -...

    • data.nasa.gov
    Updated Apr 1, 2025
    Cite
    nasa.gov (2025). ISEE 1 Solar Wind Weimer Propagation Details at 1 min Resolution - Dataset - NASA Open Data Portal [Dataset]. https://data.nasa.gov/dataset/isee-1-solar-wind-weimer-propagation-details-at-1-min-resolution
    Dataset updated
    Apr 1, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Description

    ISEE-1 Weimer-propagated solar wind data with linearly interpolated time delay, cosine angle, and goodness information for the propagated data at 1 min resolution. This data set consists of solar wind data that have first been propagated to a position just outside of the nominal bow shock (about 17, 0, 0 Re) and then linearly interpolated to 1 min resolution using the interp1.m function in MATLAB. The input data for this data set are 1 min resolution processed solar wind data constructed by Dr. J.M. Weygand.

    The method of propagation is similar to the minimum variance technique and is outlined in Weimer et al. [2003; 2004]. The basic method is to find the minimum variance direction of the magnetic field in the plane orthogonal to the mean magnetic field direction. This minimum variance direction is then dotted with the difference between the final position vector and the original position vector, and the result is divided by the minimum variance direction dotted with the solar wind velocity vector, which gives the propagation time. This method does not work well for shocks or for minimum variance directions tilted more than 70 degrees from the sun-earth line.

    This data set was originally constructed by Dr. J.M. Weygand for Prof. R.L. McPherron, who was the principal investigator of two National Science Foundation studies: GEM Grant ATM 02-1798 and Space Weather Grant ATM 02-08501. These data were primarily used in superposed epoch studies.

    References: Weimer, D. R. (2004), Correction to "Predicting interplanetary magnetic field (IMF) propagation delay times using the minimum variance technique," J. Geophys. Res., 109, A12104, doi:10.1029/2004JA010691. Weimer, D.R., D.M. Ober, N.C. Maynard, M.R. Collier, D.J. McComas, N.F. Ness, C.W. Smith, and J. Watermann (2003), Predicting interplanetary magnetic field (IMF) propagation delay times using the minimum variance technique, J. Geophys. Res., 108, 1026, doi:10.1029/2002JA009405.

  18. Geotail Comprehensive Plasma Instrumentation (CPI) data Weimer Propagated 60...

    • data.nasa.gov
    Updated Aug 21, 2025
    Cite
    nasa.gov (2025). Geotail Comprehensive Plasma Instrumentation (CPI) data Weimer Propagated 60 s Resolution in GSE Coordinates - Dataset - NASA Open Data Portal [Dataset]. https://data.nasa.gov/dataset/geotail-comprehensive-plasma-instrumentation-cpi-data-weimer-propagated-60-s-resolution-in
    Dataset updated
    Aug 21, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Description

    Geotail Weimer-propagated solar wind (CPI) data in GSE coordinates, linearly interpolated to have the measurements on the minute at 60 s resolution. This data set consists of solar wind data that have first been propagated to a position just outside of the nominal bow shock (about 17, 0, 0 Re) and then linearly interpolated to 1 min resolution using the interp1.m function in MATLAB. The input data for this data set are 1 min resolution processed solar wind data constructed by Dr. J.M. Weygand.

    The method of propagation is similar to the minimum variance technique and is outlined in Weimer et al. [2003; 2004]. The basic method is to find the minimum variance direction of the magnetic field in the plane orthogonal to the mean magnetic field direction. This minimum variance direction is then dotted with the difference between the final position vector and the original position vector, and the result is divided by the minimum variance direction dotted with the solar wind velocity vector, which gives the propagation time. This method does not work well for shocks or for minimum variance directions tilted more than 70 degrees from the sun-earth line.

    This data set was originally constructed by Dr. J.M. Weygand for Prof. R.L. McPherron, who was the principal investigator of two National Science Foundation studies: GEM Grant ATM 02-1798 and Space Weather Grant ATM 02-08501. These data were primarily used in superposed epoch studies.

    References: Weimer, D. R. (2004), Correction to "Predicting interplanetary magnetic field (IMF) propagation delay times using the minimum variance technique," J. Geophys. Res., 109, A12104, doi:10.1029/2004JA010691. Weimer, D.R., D.M. Ober, N.C. Maynard, M.R. Collier, D.J. McComas, N.F. Ness, C.W. Smith, and J. Watermann (2003), Predicting interplanetary magnetic field (IMF) propagation delay times using the minimum variance technique, J. Geophys. Res., 108, 1026, doi:10.1029/2002JA009405.

  19. The data output from the analysis of variance (+Tukey's post hoc tests) to...

    • datasetcatalog.nlm.nih.gov
    • figshare.com
    Updated Jun 26, 2020
    Cite
    Okumu, Mitchel (2020). The data output from the analysis of variance (+Tukey's post hoc tests) to determine the differences in the mean protein content of Naja ashei venom and antivenom [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0000504816
    Dataset updated
    Jun 26, 2020
    Authors
    Okumu, Mitchel
    Description

    This dataset contains the following:
    1. ANOVA table (variate: protein content)
    2. Table of effects
    3. Table of means
    4. Standard errors of differences of means
    5. Tukey's 95% confidence intervals
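
    A minimal sketch producing the same kinds of quantities from raw measurements (one-way ANOVA plus Tukey's HSD with 95% intervals); the protein-content values and group labels below are made up, not taken from this dataset.

```python
# One-way ANOVA plus Tukey's HSD with 95% confidence intervals
# (synthetic values; requires SciPy >= 1.8 for tukey_hsd).
from scipy import stats

venom_a = [21.3, 20.8, 22.1, 21.7]      # hypothetical protein content values
venom_b = [18.9, 19.4, 18.2, 19.0]
antivenom = [25.1, 24.6, 25.8, 25.3]

f_stat, p_val = stats.f_oneway(venom_a, venom_b, antivenom)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.4f}")

tukey = stats.tukey_hsd(venom_a, venom_b, antivenom)
print(tukey)                                   # pairwise differences and p-values
print(tukey.confidence_interval(0.95).low)     # lower bounds of the 95% CIs
```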

  20. Dataset for: Power analysis for multivariable Cox regression models

    • wiley.figshare.com
    • search.datacite.org
    txt
    Updated May 31, 2023
    Cite
    Emil Scosyrev; Ekkehard Glimm (2023). Dataset for: Power analysis for multivariable Cox regression models [Dataset]. http://doi.org/10.6084/m9.figshare.7010483.v1
    Available download formats: txt
    Dataset updated
    May 31, 2023
    Dataset provided by
    Wiley (https://www.wiley.com/)
    Authors
    Emil Scosyrev; Ekkehard Glimm
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    In power analysis for multivariable Cox regression models, variance of the estimated log-hazard ratio for the treatment effect is usually approximated by inverting the expected null information matrix. Because in many typical power analysis settings assumed true values of the hazard ratios are not necessarily close to unity, the accuracy of this approximation is not theoretically guaranteed. To address this problem, the null variance expression in power calculations can be replaced with one of alternative expressions derived under the assumed true value of the hazard ratio for the treatment effect. This approach is explored analytically and by simulations in the present paper. We consider several alternative variance expressions, and compare their performance to that of the traditional null variance expression. Theoretical analysis and simulations demonstrate that while the null variance expression performs well in many non-null settings, it can also be very inaccurate, substantially underestimating or overestimating the true variance in a wide range of realistic scenarios, particularly those where the numbers of treated and control subjects are very different and the true hazard ratio is not close to one. The alternative variance expressions have much better theoretical properties, confirmed in simulations. The most accurate of these expressions has a relatively simple form - it is the sum of inverse expected event counts under treatment and under control scaled up by a variance inflation factor.
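
    Following the abstract's closing sentence, a hedged sketch of the implied calculation: approximate Var(log HR) by a variance inflation factor times the sum of inverse expected event counts, then obtain power from a normal approximation. The exact expressions and their derivation are in the article; the numbers below are illustrative.

```python
# Hedged sketch: Var(log HR) ~ VIF * (1/d_trt + 1/d_ctl), where d are expected
# event counts; power then follows from a normal approximation.
from scipy import stats
import math

def cox_power(log_hr, events_trt, events_ctl, vif=1.0, alpha=0.05):
    se = math.sqrt(vif * (1.0 / events_trt + 1.0 / events_ctl))
    z_crit = stats.norm.ppf(1 - alpha / 2)
    return stats.norm.cdf(abs(log_hr) / se - z_crit)

# e.g. HR = 0.7, 120 vs 150 expected events, modest covariate adjustment
print(round(cox_power(math.log(0.7), 120, 150, vif=1.1), 3))
```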
