37 datasets found
  1. Breakdown of Methods Used to Combine P-values Investigated.

    • figshare.com
    xls
    Updated Jun 5, 2023
    Cite
    Gelio Alves; Yi-Kuo Yu (2023). Breakdown of Methods Used to Combine P-values Investigated. [Dataset]. http://doi.org/10.1371/journal.pone.0091225.t001
    Explore at:
    xls (available download formats)
    Dataset updated
    Jun 5, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Gelio Alves; Yi-Kuo Yu
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The first column of the table provides the names of the methods used to combine P-values investigated in our study. The second column lists the reference number (Ref) cited in this paper for the publication corresponding to each method. The third column provides the equation number of the distribution function used to compute the formula P-value. The fourth column indicates whether a method's equation can accommodate (acc.) weights when combining P-values. The fifth column gives the normalization (nor.) procedure used to normalize the weights. Finally, the last column indicates whether a method can account for correlation (corr.) between P-values.
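
    As a concrete illustration of one such combination method, Fisher's method pools k independent P-values through a chi-square distribution with 2k degrees of freedom. The sketch below (using SciPy; the function name fisher_combine is ours, not from the dataset) only shows the idea and does not reproduce any of the implementations evaluated in the study.

        import math
        from scipy.stats import chi2

        def fisher_combine(pvalues):
            # -2 * sum(log p_i) follows a chi-square distribution with 2k degrees
            # of freedom when the k P-values are independent and uniform under H0.
            k = len(pvalues)
            statistic = -2.0 * sum(math.log(p) for p in pvalues)
            return chi2.sf(statistic, df=2 * k)

        print(fisher_combine([0.01, 0.20, 0.40]))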

  2. A China's normalized tree biomass equation dataset

    • service.tib.eu
    • doi.pangaea.de
    Updated Nov 30, 2024
    Cite
    (2024). A China's normalized tree biomass equation dataset [Dataset]. https://service.tib.eu/ldmservice/dataset/png-doi-10-1594-pangaea-895244
    Explore at:
    Dataset updated
    Nov 30, 2024
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    China
    Description

    The dataset was conceived, designed, and is maintained by Xiaoke WANG, Zhiyun OUYANG and Yunjian LUO. To develop China's normalized tree biomass equation dataset, we carried out an extensive survey and critical review of the literature (from 1978 to 2013) on biomass equations developed in China. It consists of 5924 biomass equations for nearly 200 species (Equation sheet) and their associated background information (General sheet), with sound geographical, climatic and forest vegetation coverage across China. The dataset is freely available for non-commercial scientific applications, provided it is appropriately cited. For further information, please read our Earth System Science Data article (https://doi.org/10.5194/essd-2019-1), or feel free to contact the authors.
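
    Biomass equations of this kind are typically power-law allometric models; purely as a hypothetical illustration of how one such equation is applied (the coefficients below are placeholders, not values from the dataset):

        def tree_biomass_kg(dbh_cm, a=0.05, b=2.5):
            # Power-law allometric form B = a * D**b; a and b are placeholder
            # values, not coefficients taken from this dataset.
            return a * dbh_cm ** b

        print(round(tree_biomass_kg(25.0), 1))  # hypothetical tree with 25 cm DBH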

  3. Left ventricular mass is underestimated in overweight children because of...

    • plos.figshare.com
    txt
    Updated May 31, 2023
    Cite
    Hubert Krysztofiak; Marcel Młyńczak; Łukasz A. Małek; Andrzej Folga; Wojciech Braksator (2023). Left ventricular mass is underestimated in overweight children because of incorrect body size variable chosen for normalization [Dataset]. http://doi.org/10.1371/journal.pone.0217637
    Explore at:
    txt (available download formats)
    Dataset updated
    May 31, 2023
    Dataset provided by
    PLOS (http://plos.org/)
    Authors
    Hubert Krysztofiak; Marcel Młyńczak; Łukasz A. Małek; Andrzej Folga; Wojciech Braksator
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Background: Left ventricular mass normalization for body size is recommended, but a question remains: what is the best body size variable for this normalization (body surface area, height, or lean body mass computed from a predictive equation)? Since body surface area and computed lean body mass are derivatives of body mass, normalizing for them may result in underestimation of left ventricular mass in overweight children. The aim of this study is to indicate which of the body size variables normalizes left ventricular mass without underestimating it in overweight children.

    Methods: Left ventricular mass assessed by echocardiography, height and body mass were collected for 464 healthy boys, 5–18 years old. Lean body mass and body surface area were calculated. Left ventricular mass z-scores, computed from reference data developed for height, body surface area and lean body mass, were compared between overweight and non-overweight children. The next step was a comparison of paired samples of expected left ventricular mass, estimated for each normalizing variable from two allometric equations: the first developed for overweight children, the second for children of normal body mass.

    Results: The mean left ventricular mass z-score is higher in overweight children than in non-overweight children for normative data based on height (0.36 vs. 0.00) and lower for normative data based on body surface area (-0.64 vs. 0.00). Left ventricular mass estimated by normalizing for height, based on the equation for overweight children, is higher in overweight children (128.12 vs. 118.40); however, masses estimated by normalizing for body surface area and lean body mass, based on the equations for overweight children, are lower in overweight children (109.71 vs. 122.08 and 118.46 vs. 120.56, respectively).

    Conclusion: Normalization for body surface area and for computed lean body mass, but not for height, underestimates left ventricular mass in overweight children.
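
    To make the normalization step concrete, a minimal sketch of allometric indexing of left ventricular mass to a body-size variable is shown below; the height exponent of 2.7 is a common convention from the pediatric literature, used here only as a placeholder and not as the exponent fitted in this study.

        def lvm_index(lvm_g, body_size, exponent=2.7):
            # Generic allometric indexing LVM / X**b.  The exponent is a
            # placeholder convention, not a value estimated in this dataset.
            return lvm_g / body_size ** exponent

        print(round(lvm_index(120.0, 1.50), 1))  # LVM in grams, height in metres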

  4. Source code for the calculation of the re-normalized mean resultant length

    • dataverse.csuc.cat
    text/x-matlab, txt
    Updated Sep 28, 2023
    Cite
    Ralph Gregor Andrzejak; Ralph Gregor Andrzejak; Anaïs Espinoso; Anaïs Espinoso; Eduardo García-Portugués; Eduardo García-Portugués; Arthur Pewsey; Arthur Pewsey; Jacopo Epifanio; Jacopo Epifanio; Marc G. Leguia; Marc G. Leguia; Kaspar Schindler; Kaspar Schindler (2023). Source code for the calculation of the re-normalized mean resultant length [Dataset]. http://doi.org/10.34810/data845
    Explore at:
    text/x-matlab(4211), text/x-matlab(2290), text/x-matlab(770), txt(3846), text/x-matlab(1438) (available download formats)
    Dataset updated
    Sep 28, 2023
    Dataset provided by
    CORA.Repositori de Dades de Recerca
    Authors
    Ralph Gregor Andrzejak; Ralph Gregor Andrzejak; Anaïs Espinoso; Anaïs Espinoso; Eduardo García-Portugués; Eduardo García-Portugués; Arthur Pewsey; Arthur Pewsey; Jacopo Epifanio; Jacopo Epifanio; Marc G. Leguia; Marc G. Leguia; Kaspar Schindler; Kaspar Schindler
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This source code was published as supporting material for the article: Ralph G. Andrzejak, Anaïs Espinoso, Eduardo García-Portugués, Arthur Pewsey, Jacopo Epifanio, Marc G. Leguia, Kaspar Schindler; High expectations on phase locking: Better quantifying the concentration of circular data. Chaos (2023); 33 (9): 091106. https://doi.org/10.1063/5.0166468
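
    For orientation, the quantity being re-normalized is the classical mean resultant length R of a sample of phase angles; a minimal Python sketch of the standard (non-renormalized) estimator is given below, whereas the re-normalized estimator proposed in the article is implemented in the MATLAB files of this dataset.

        import numpy as np

        def mean_resultant_length(phases_rad):
            # R = |mean(exp(i*phi))|: close to 1 for tightly concentrated phases,
            # close to 0 for phases spread uniformly around the circle.
            return np.abs(np.mean(np.exp(1j * np.asarray(phases_rad))))

        print(mean_resultant_length([0.10, 0.20, 0.15, 6.20]))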

  5. Data from: POINCARÉ CODE: A package of open-source implements for...

    • data.mendeley.com
    Updated Sep 1, 2013
    Cite
    J. Mikram (2013). POINCARÉ CODE: A package of open-source implements for normalization and computer algebra reduction near equilibria of coupled ordinary differential equations [Dataset]. http://doi.org/10.17632/tsyg3k6khh.1
    Explore at:
    Dataset updated
    Sep 1, 2013
    Authors
    J. Mikram
    License

    https://www.elsevier.com/about/policies/open-access-licenses/elsevier-user-license/cpc-license/

    Description

    Abstract: The Poincaré code is a Maple project package that aims to gather significant computer algebra normal form (and subsequent reduction) methods for handling nonlinear ordinary differential equations. As a first version, a set of fourteen easy-to-use Maple commands is introduced for symbolic creation of (improved variants of Poincaré's) normal forms as well as their associated normalizing transformations. The software is the implementation by the authors of carefully studied and followed up sele...

    Title of program: POINCARÉ Catalogue Id: AEPJ_v1_0

    Nature of problem: Computing structure-preserving normal forms near the origin for nonlinear vector fields.

    Versions of this program held in the CPC repository in Mendeley Data AEPJ_v1_0; POINCARÉ; 10.1016/j.cpc.2013.04.003

    This program has been imported from the CPC Program Library held at Queen's University Belfast (1969-2018)

  6. Estimation of Normalized Profit Function and Factor Share Equation for...

    • afrischolarrepository.net.ng
    Updated Jan 26, 2024
    + more versions
    Cite
    (2024). Estimation of Normalized Profit Function and Factor Share Equation for Cassava-based Farmers in Odukpani L.G.A., Cross River State. - Dataset - Afrischolar Discovery Initiative (ADI) [Dataset]. https://afrischolarrepository.net.ng/dataset/estimation-of-normalized-profit-functi
    Explore at:
    Dataset updated
    Jan 26, 2024
    License

    Attribution-NonCommercial 2.0 (CC BY-NC 2.0)https://creativecommons.org/licenses/by-nc/2.0/
    License information was derived automatically

    Area covered
    Cross River, Odukpani
    Description

    International Journal of Social Studies and Public Policy

  7. Normalized lift LN equations for different lift-generating systems.

    • figshare.com
    xls
    Updated May 30, 2023
    Cite
    Phillip Burgers; David E. Alexander (2023). Normalized lift LN equations for different lift-generating systems. [Dataset]. http://doi.org/10.1371/journal.pone.0036732.t001
    Explore at:
    xls (available download formats)
    Dataset updated
    May 30, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Phillip Burgers; David E. Alexander
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Normalized lift LN equations for different lift-generating systems.

  8. Data from: Estimating global transpiration from TROPOMI SIF with angular...

    • zenodo.org
    Updated Jan 21, 2025
    Cite
    Chen Zheng; Chen Zheng (2025). Estimating global transpiration from TROPOMI SIF with angular normalization and separation for sunlit and shaded leaves [Dataset]. http://doi.org/10.5281/zenodo.14211029
    Explore at:
    Dataset updated
    Jan 21, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Chen Zheng; Chen Zheng
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    Jan 1, 2019
    Description

    All three types of SIF-driven T models integrate canopy conductance (gc) with the Penman-Monteith model, differing in how gc is derived: from a SIFobs driven semi-mechanistic equation, a SIFsunlit and SIFshaded driven semi-mechanistic equation, and a SIFsunlit and SIFshaded driven machine learning model.

    The difference between the simplified SIF-gc equation and the full SIF-gc equation lies in the treatment of some parameters, as described in https://doi.org/10.1016/j.rse.2024.114586.

    In this dataset, the temporal resolution is 1 day, and the spatial resolution is 0.2 degree.

    BL: SIFobs driven semi-mechanistic model

    TL: SIFsunlit and SIFshaded driven semi-mechanistic model

    hybrid models: SIFsunlit and SIFshaded driven machine learning model.
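
    To illustrate where gc enters, a textbook form of the Penman-Monteith equation is sketched below; this generic form and its parameter values are illustrative assumptions, not the exact configuration used to produce this dataset.

        def penman_monteith_le(rn, vpd, ga, gc, delta, gamma=0.066, rho_cp=1216.0):
            # rn: available energy (W m-2); vpd: vapour pressure deficit (kPa)
            # ga, gc: aerodynamic and canopy conductance (m s-1)
            # delta: slope of the saturation vapour pressure curve (kPa K-1)
            # gamma: psychrometric constant (kPa K-1)
            # rho_cp: air density times heat capacity of air (J m-3 K-1)
            return (delta * rn + rho_cp * vpd * ga) / (delta + gamma * (1.0 + ga / gc))

        # Latent heat flux (W m-2) for one hypothetical set of inputs:
        print(round(penman_monteith_le(rn=400.0, vpd=1.2, ga=0.02, gc=0.008, delta=0.145), 1))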

  9. Normalized Difference Water Index (NDWI) - Annual Mean - Switzerland

    • geonetwork.swissdatacube.org
    doi +1
    Updated Sep 17, 2019
    + more versions
    Cite
    Université de Genève (2019). Normalized Difference Water Index (NDWI) - Annual Mean - Switzerland [Dataset]. https://geonetwork.swissdatacube.org/geonetwork/srv/api/records/1008ba03-a57d-42d0-b7d7-3a861d91c4be
    Explore at:
    doi, ogc:wms-1.3.0-http-get-capabilities (available download formats)
    Dataset updated
    Sep 17, 2019
    Dataset authored and provided by
    Université de Genève
    Time period covered
    Mar 28, 1984 - Jun 15, 2019
    Area covered
    Description

    This dataset is an annual time series of Landsat Analysis Ready Data (ARD)-derived Normalized Difference Water Index (NDWI) computed from Landsat 5 Thematic Mapper (TM) and Landsat 8 Operational Land Imager (OLI). To ensure a consistent dataset, Landsat 7 has not been used because the Scan Line Corrector (SLC) failure creates gaps in the data. NDWI quantifies plant water content by measuring the difference between the Near-Infrared (NIR) and Short Wave Infrared (SWIR) (or Green) channels using this generic formula: (NIR - SWIR) / (NIR + SWIR). For Landsat sensors, this corresponds to the following bands: Landsat 5, NDWI = (Band 4 – Band 2) / (Band 4 + Band 2); Landsat 8, NDWI = (Band 5 – Band 3) / (Band 5 + Band 3). NDWI values range from -1 to +1. NDWI is a good proxy for plant water stress and is therefore useful for drought monitoring and early warning. NDWI is sometimes also referred to as the Normalized Difference Moisture Index (NDMI). The standard deviation is also provided for each time step. Data format: GeoTiff. This dataset has been generated with the Swiss Data Cube (http://www.swissdatacube.ch).
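
    A minimal sketch of the index computation described above (the band arrays and their ordering are assumptions of the example, not a prescription for this product):

        import numpy as np

        def ndwi(nir, other):
            # NDWI = (NIR - SWIR or Green) / (NIR + SWIR or Green), pixel-wise;
            # pixels where both bands are zero are set to NaN.
            nir = np.asarray(nir, dtype="float64")
            other = np.asarray(other, dtype="float64")
            with np.errstate(divide="ignore", invalid="ignore"):
                return np.where(nir + other == 0, np.nan, (nir - other) / (nir + other))

        print(ndwi([0.30, 0.10], [0.10, 0.30]))  # -> [0.5, -0.5]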

  10. SILAC Peptide Ratio Calculator: A Tool for SILAC Quantitation of Peptides...

    • figshare.com
    txt
    Updated May 30, 2023
    Cite
    Xiaoyan Guan; Neha Rastogi; Mark R. Parthun; Michael A. Freitas (2023). SILAC Peptide Ratio Calculator: A Tool for SILAC Quantitation of Peptides and Post-Translational Modifications [Dataset]. http://doi.org/10.1021/pr400675n.s003
    Explore at:
    txt (available download formats)
    Dataset updated
    May 30, 2023
    Dataset provided by
    ACS Publications
    Authors
    Xiaoyan Guan; Neha Rastogi; Mark R. Parthun; Michael A. Freitas
    License

    Attribution-NonCommercial 4.0 (CC BY-NC 4.0)https://creativecommons.org/licenses/by-nc/4.0/
    License information was derived automatically

    Description

    This paper describes an algorithm to assist in relative quantitation of peptide post-translational modifications using stable isotope labeling by amino acids in cell culture (SILAC). The described algorithm first determines the normalization factor and then calculates SILAC ratios for a list of target peptide masses using precursor ion abundances. Four yeast histone mutants were used to demonstrate the effectiveness of this approach for quantitation of changes in peptide post-translational modifications. The details of the algorithm's approach to normalization and peptide ratio calculation are described. The examples demonstrate the robustness of the approach as well as its utility to rapidly determine changes in peptide post-translational modifications within a protein.
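
    The general idea can be sketched as follows: compute heavy/light ratios from precursor ion abundances and divide by a global normalization factor. The median is used here only as an assumed normalization choice; this sketch is not the published SILAC Peptide Ratio Calculator.

        import statistics

        def silac_ratios(heavy, light):
            # heavy, light: precursor ion abundances for matched peptide pairs.
            raw = [h / l for h, l in zip(heavy, light)]
            norm_factor = statistics.median(raw)  # assumed choice of normalization factor
            return [r / norm_factor for r in raw]

        print(silac_ratios([2.0e6, 5.0e5, 8.0e6], [1.0e6, 1.0e6, 2.0e6]))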

  11. Data from: Estimating global transpiration from TROPOMI SIF with angular...

    • zenodo.org
    zip
    Updated Sep 15, 2024
    Cite
    Chen Zheng; Chen Zheng (2024). Estimating global transpiration from TROPOMI SIF with angular normalization and separation for sunlit and shaded leaves [Dataset]. http://doi.org/10.5281/zenodo.10674432
    Explore at:
    zip (available download formats)
    Dataset updated
    Sep 15, 2024
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Chen Zheng; Chen Zheng
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    Jan 1, 2019
    Description

    All three types of SIF-driven T models integrate canopy conductance (gc) with the Penman-Monteith model, differing in how gc is derived: from a SIFobs driven semi-mechanistic equation, a SIFsunlit and SIFshaded driven semi-mechanistic equation, and a SIFsunlit and SIFshaded driven machine learning model.

    BL: SIFobs driven semi-mechanistic model

    TL: SIFsunlit and SIFshaded driven semi-mechanistic model

    hybrid models: SIFsunlit and SIFshaded driven machine learning model.

  12. A program for normalised Morse functions

    • elsevier.digitalcommonsdata.com
    Updated Jan 1, 1972
    Cite
    J.R. Parkinson (1972). A program for normalised Morse functions [Dataset]. http://doi.org/10.17632/6bv4ysph7k.1
    Explore at:
    Dataset updated
    Jan 1, 1972
    Authors
    J.R. Parkinson
    License

    https://www.elsevier.com/about/policies/open-access-licenses/elsevier-user-license/cpc-license/

    Description

    Title of program: MORSEFNS Catalogue Id: AAGM_v1_0

    Nature of problem: To produce the normalised eigenfunctions of the Schrödinger equation with the Morse potential.
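
    For reference, the Morse potential entering this Schrödinger problem has the standard textbook form (generic notation, not taken from the program listing):

        V(r) = D_e \left( 1 - e^{-a (r - r_e)} \right)^2

    where D_e is the well depth, a the width parameter, and r_e the equilibrium separation.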

    Versions of this program held in the CPC repository in Mendeley Data AAGM_v1_0; MORSEFNS; 10.1016/0010-4655(72)90017-3

    This program has been imported from the CPC Program Library held at Queen's University Belfast (1969-2019)

  13. (Table 1) Normalized peak intensities from solid phase extracted dissolved...

    • service.tib.eu
    Updated Nov 30, 2024
    + more versions
    Cite
    (2024). (Table 1) Normalized peak intensities from solid phase extracted dissolved organic matter from porewater and overlying bottom water collected during Maria S. Merian cruise MSM29 and Polarstern cruise PS85 to the Fram Strait - Vdataset - LDM [Dataset]. https://service.tib.eu/ldmservice/dataset/png-doi-10-1594-pangaea-909107
    Explore at:
    Dataset updated
    Nov 30, 2024
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Area covered
    Fram Strait
    Description

    Dissolved organic matter molecular analyses were performed on a Solarix FT-ICR-MS equipped with a 15 Tesla superconducting magnet (Bruker Daltonics) using an electrospray ionization source (Bruker Apollo II) in negative ion mode. Molecular formula calculation for all samples was performed using a Matlab (2010) routine that searches, with an error of < 0.5 ppm, for all potential combinations of elements including C∞, O∞, H∞, N = 4, S = 2 and P = 1. Combinations of the elements NSP, N2S, N3S, N4S, N2P, N3P, N4P, NS2, N2S2, N3S2, N4S2 and S2P were not allowed. Mass peak intensities are normalized relative to the total of molecular formulas in each sample according to previously published rules (Rossel et al., 2015; doi:10.1016/j.marchem.2015.07.002). The final data contained 7400 molecular formulae.
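
    Assuming that "normalized relative to the total" means dividing each peak intensity by the per-sample sum over all assigned formulas, a minimal sketch of that step is:

        def normalize_intensities(intensities):
            # Express each peak intensity as a fraction of the per-sample total
            # (sum normalization is an assumption of this sketch).
            total = sum(intensities)
            return [i / total for i in intensities]

        print(normalize_intensities([4.0e7, 1.0e7, 5.0e6]))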

  14. Data from: Quantum computation of the Cobb-Douglas utility function via the...

    • data.mendeley.com
    Updated Dec 17, 2024
    + more versions
    Cite
    Isabel Cristina Betancur-Hinestroza (2024). Quantum computation of the Cobb-Douglas utility function via the 2D-Clairaut differential equation [Dataset]. http://doi.org/10.17632/h9ny2yy9hs.3
    Explore at:
    Dataset updated
    Dec 17, 2024
    Authors
    Isabel Cristina Betancur-Hinestroza
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This paper introduces the integration of the Cobb-Douglas (CD) utility model with quantum computation, utilizing the Clairaut-type differential formula. We propose a novel economic-physical model using envelope theory to establish a direct link with quantum entanglement, thereby defining emergent probabilities in the optimal utility function for two goods within a given expenditure limit. The study illuminates the interaction between the CD model and quantum computation, emphasizing the elucidation of system entropy and the role of Clairaut differential equations in understanding the utility's optimal envelopes and intercepts. Innovative algorithms utilizing the 2D-Clairaut differential equation are introduced for the quantum formulation of the CD function, showcasing accurate representation in quantum circuits for one and two qubits. Our empirical findings, validated through IBM-Q computer simulations, align with analytical predictions, demonstrating the robustness of our approach. This methodology articulates the utility-budget relationship within the CD function through a clear model based on envelope representation and canonical line equations, where normalized intercepts signify probabilities. The efficiency and precision of our results, especially in modeling one- and two-qubit quantum entanglement within econometrics, surpass those of IBM-Q simulations, which require extensive iterations for comparable accuracy. This underscores our method's effectiveness in merging economic models with quantum computation.
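
    For reference, the two-good Cobb-Douglas utility function referred to above has the standard textbook form (generic notation, not the paper's specific parametrization):

        U(x, y) = x^{\alpha} \, y^{1-\alpha}, \qquad 0 < \alpha < 1,

    maximized subject to the budget constraint p_x x + p_y y = m.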

  15. Calcification Dissolution Potential Tool for Excel: Version 1

    • explore.openaire.eu
    Updated Sep 5, 2022
    Cite
    Travis A Courtney; Andreas J Andersson (2022). Calcification Dissolution Potential Tool for Excel: Version 1 [Dataset]. http://doi.org/10.5281/zenodo.7051627
    Explore at:
    Dataset updated
    Sep 5, 2022
    Authors
    Travis A Courtney; Andreas J Andersson
    Description

    The following data entry sheets are designed to quantify salinity-normalized seawater total alkalinity anomalies (∆nTA) from inputs of offshore and coral reef total alkalinity (TA) and salinity (S) data, while taking into account the various sources of uncertainty associated with these normalizations and calculations, in order to estimate the CDP for each reef observation (for details see Courtney et al., 2021). Only cells blocked in white should be modified on the "Data Entry" sheet; all cells blocked in gray are locked to protect the formulas from being modified. Data for at least one offshore TA and S sample and one coral reef TA and S sample must be entered to display the ∆nTA and CDP for the given reef system. The equations herein average all offshore TA and S data when calculating ∆nTA, in order to leverage all available data. Additionally, the spreadsheets allow the reference S to be set to the mean offshore or mean coral reef S, and the calculations are carried out for a range of freshwater TA endmembers, including the option for a user-defined value. ∆nTA is calculated as per the equations from Courtney et al. (2021).

    The CDP summary page also provides a number of summary graphs to visualize (1) whether there are apparent relationships between coral reef TA and S, (2) how the ∆nTA of the inputted data compares to global coral reef ∆TA data from Cyronak et al. (2018), (3) how the ∆nTA data vary spatially across the reef locations, and (4) how well the ∆nTA data cover a complete diel cycle.

    For further details on the uncertainties associated with the salinity normalization of coral reef data and the relevant equations, please see the following publication: Courtney TA, Cyronak T, Griffin AJ, Andersson AJ (2021) Implications of salinity normalization of seawater total alkalinity in coral reef metabolism studies. PLOS One 16(12): e0261210. https://doi.org/10.1371/journal.pone.0261210

    Please cite as: Courtney TA & Andersson AJ (2022) Calcification Dissolution Potential Tool for Excel: Version 1. https://doi.org/10.5281/zenodo.7051628
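
    A minimal sketch of the core calculation, assuming the simple ratio normalization nTA = TA * S_ref / S with a zero freshwater TA endmember (the spreadsheet supports additional endmember options and uncertainty propagation that this sketch does not reproduce):

        def normalized_ta(ta, s, s_ref):
            # Ratio salinity normalization nTA = TA * S_ref / S (umol/kg).
            return ta * s_ref / s

        def delta_nta(ta_reef, s_reef, ta_offshore, s_offshore, s_ref):
            # Reef minus (mean) offshore salinity-normalized total alkalinity.
            return normalized_ta(ta_reef, s_reef, s_ref) - normalized_ta(ta_offshore, s_offshore, s_ref)

        print(round(delta_nta(2250.0, 35.2, 2300.0, 35.0, 35.0), 1))  # hypothetical values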

  16. Trace element geochemistry of zircons from in situ ocean lithosphere

    • search.dataone.org
    • doi.pangaea.de
    Updated Jan 6, 2018
    + more versions
    Cite
    Grimes, Craig B; John, Barbara E; Cheadle, Michael J; Mazdab, Frank K; Wooden, Joseph L; Swapp, Susan; Schwartz, Joshua J (2018). Trace element geochemistry of zircons from in situ ocean lithosphere [Dataset]. http://doi.org/10.1594/PANGAEA.772872
    Explore at:
    Dataset updated
    Jan 6, 2018
    Dataset provided by
    PANGAEA Data Publisher for Earth and Environmental Science
    Authors
    Grimes, Craig B; John, Barbara E; Cheadle, Michael J; Mazdab, Frank K; Wooden, Joseph L; Swapp, Susan; Schwartz, Joshua J
    Time period covered
    Dec 6, 1987 - Dec 7, 2004
    Area covered
    Description

    We characterize the textural and geochemical features of ocean crustal zircon recovered from plagiogranite, evolved gabbro, and metamorphosed ultramafic host-rocks collected along present-day slow and ultraslow spreading mid-ocean ridges (MORs). The geochemistry of 267 zircon grains was measured by sensitive high-resolution ion microprobe-reverse geometry at the USGS-Stanford Ion Microprobe facility. Three types of zircon are recognized based on texture and geochemistry. Most ocean crustal zircons resemble young magmatic zircon from other crustal settings, occurring as pristine, colorless euhedral (Type 1) or subhedral to anhedral (Type 2) grains. In these grains, Hf and most trace elements vary systematically with Ti, typically becoming enriched with falling Ti-in-zircon temperature. Ti-in-zircon temperatures range from 1,040 to 660°C (corrected for a TiO2 ~ 0.7, a SiO2 ~ 1.0, pressure ~ 2 kbar); intra-sample variation is typically ~60-15°C. Decreasing Ti correlates with enrichment in Hf to ~2 wt%, while additional Hf-enrichment occurs at relatively constant temperature. Trends between Ti and U, Y, REE, and Eu/Eu* exhibit a similar inflection, which may denote the onset of eutectic crystallization; the inflection is well-defined by zircons from plagiogranite and implies solidus temperatures of ~680-740°C. A third type of zircon is defined as being porous and colored with chaotic CL zoning, and occurs in ~25% of rock samples studied. These features, along with high measured La, Cl, S, Ca, and Fe, and low (Sm/La)N ratios are suggestive of interaction with aqueous fluids. Non-porous, luminescent CL overgrowth rims on porous grains record uniform temperatures averaging 615 ± 26°C (2SD, n = 7), implying zircon formation below the wet-granite solidus and under water-saturated conditions. Zircon geochemistry reflects, in part, source region; elevated HREE coupled with low U concentrations allow effective discrimination of ~80% of zircon formed at modern MORs from zircon in continental crust. The geochemistry and textural observations reported here serve as an important database for comparison with detrital, xenocrystic, and metamorphosed mafic rock-hosted zircon populations to evaluate provenance.

  17. Beach Litter - Composition of litter according to material categories in...

    • pigma.org
    • catalogue.arctic-sdi.org
    • +1more
    doi, www:download +1
    + more versions
    Cite
    National Institute of Oceanography and Applied Geophysics - OGS, Division of Oceanography, Beach Litter - Composition of litter according to material categories in percent normalized per beach per year - Other sources 2001/2020 v2021 [Dataset]. https://www.pigma.org/geonetwork/BORDEAUX_METROPOLE_DIR_INFO_GEO/api/records/2619e339-4f1b-4e58-81bf-28f1411da968
    Explore at:
    www:download, doi, www:link (available download formats)
    Dataset provided by
    IFREMER, SISMER, Scientific Information Systems for the SEA
    EMODnet Chemistry
    National Institute of Oceanography and Applied Geophysics - OGS, Division of Oceanography
    Ifremer, VIGIES (Information Valuation Service for Integrated Management and Monitoring)
    Time period covered
    Jan 1, 2001 - Apr 22, 2020
    Area covered
    Description

    This visualization product displays marine macro-litter (> 2.5cm) material categories percentage per beach per year from non-MSFD monitoring surveys, research & cleaning operations.

    EMODnet Chemistry included the collection of marine litter in its 3rd phase. Since the beginning of 2018, data of beach litter have been gathered and processed in the EMODnet Chemistry Marine Litter Database (MLDB). The harmonization of all the data has been the most challenging task considering the heterogeneity of the data sources, sampling protocols and reference lists used on a European scale.

    Preliminary processing was necessary to harmonize all the data:
    - Exclusion of the OSPAR 1000 protocol, in order to follow the approach of OSPAR, which no longer includes these data in its monitoring;
    - Selection of surveys from non-MSFD monitoring, cleaning and research operations;
    - Exclusion of beaches without coordinates;
    - Exclusion of surveys without an associated length;
    - Removal of some litter types such as organic litter, small fragments (paraffin and wax; items > 2.5 cm) and pollutants. The list of selected items is attached to this metadata. This list was created using the EU Marine Beach Litter Baselines and the EU Threshold Value for Macro Litter on Coastlines from JRC (these two documents are attached to this metadata);
    - Exclusion of the "feaces" category, which corresponds to the dog-excrement-in-bags items of the OSPAR (item code: 121) and ITA (item code: IT59) reference lists;
    - Normalization of survey lengths to 100 m and 1 survey / year: in some cases the survey length was not 100 m, so in order to compare litter abundance across beaches a normalization is applied using this formula: Number of items (normalized to 100 m) = Number of items x (100 / survey length). This normalized number of items is then summed to obtain the total normalized number of litter items for each survey.

    To calculate the percentage of each material category, the formula applied is: Material (%) = (∑ number of items (normalized to 100 m) of each material category) * 100 / (∑ number of items (normalized to 100 m) of all categories)
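
    A short sketch of the two formulas above (category names and counts are made-up example values):

        def normalize_to_100m(n_items, survey_length_m):
            # Number of items (normalized to 100 m) = number of items * (100 / survey length)
            return n_items * (100.0 / survey_length_m)

        def material_percentages(counts_by_material, survey_length_m):
            # Material (%) = normalized count of one category * 100 / normalized count of all categories
            normalized = {m: normalize_to_100m(n, survey_length_m)
                          for m, n in counts_by_material.items()}
            total = sum(normalized.values())
            return {m: 100.0 * v / total for m, v in normalized.items()}

        print(material_percentages({"plastic": 180, "metal": 12, "glass": 8}, survey_length_m=250.0))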

    The material categories differ between reference lists (OSPAR, TSG_ML, UNEP, UNEP_MARLIN). In order to apply a common procedure for all the surveys, the material categories have been harmonized.

    More information is available in the attached documents.

    Warning: the absence of data on the map doesn't necessarily mean that they don't exist, but that no information has been entered in the Marine Litter Database for this area.

  18. Data from: A pilot study to determine the timing and effect of bevacizumab...

    • odportal.tw
    Updated Jul 14, 2016
    Cite
    (2016). A pilot study to determine the timing and effect of bevacizumab on vascular normalization of metastatic brain tumors in breast cancer. [Dataset]. https://odportal.tw/dataset/2Xcjt7Um
    Explore at:
    Dataset updated
    Jul 14, 2016
    License

    https://data.gov.tw/license

    Description

    Background: To determine the appropriate time of concomitant chemotherapy administration after antiangiogenic treatment, we investigated the timing and effect of bevacizumab administration on vascular normalization of metastatic brain tumors in breast cancer patients.

    Methods: Eight patients who participated in a phase II trial for breast cancer-induced refractory brain metastases were enrolled and subjected to 4 dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) examinations that evaluated Peak, Slope, iAUC60, and Ktrans before and after treatment. The treatment comprised bevacizumab on Day 1, etoposide on Days 2–4, and cisplatin on Day 2 in a 21-day cycle for a maximum of 6 cycles. DCE-MRI was performed before treatment and at 1 h, 24 h, and 21 days after bevacizumab administration.

    Results: Values of the 4 DCE-MRI parameters were reduced after bevacizumab administration. Compared with baseline values, the mean reductions at 1 and 24 h were -12.8 and -24.7 % for Peak, -46.6 and -65.8 % for Slope, -27.9 and -55.5 % for iAUC60, and -46.6 and -63.9 % for Ktrans, respectively (all P < .05). The differences between the 1 and 24 h mean reductions were significant (all P < .05) for all the parameters. Generalized estimating equation linear regression analyses of the 4 DCE-MRI parameters revealed that vascular normalization peaked 24 h after bevacizumab administration.

    Conclusion: Bevacizumab induced vascular normalization of brain metastases in humans at 1 and 24 h after administration, and the effect was significantly greater at 24 h than at 1 h.

  19. Table S1 - Equations for Lipid Normalization of Carbon Stable Isotope Ratios...

    • figshare.com
    • plos.figshare.com
    xlsx
    Updated Jun 6, 2023
    Cite
    Kyle H. Elliott; Mikaela Davis; John E. Elliott (2023). Table S1 - Equations for Lipid Normalization of Carbon Stable Isotope Ratios in Aquatic Bird Eggs [Dataset]. http://doi.org/10.1371/journal.pone.0083597.s001
    Explore at:
    xlsx (available download formats)
    Dataset updated
    Jun 6, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Kyle H. Elliott; Mikaela Davis; John E. Elliott
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Stable isotope data analyzed in the manuscript. (XLSX)

  20. Diffusion FRAP models corrected for photofading and immobile fraction and...

    • plos.figshare.com
    xls
    Updated Jun 2, 2023
    Cite
    Minchul Kang; Manuel Andreani; Anne K. Kenworthy (2023). Diffusion FRAP models corrected for photofading and immobile fraction and their compatibilities. [Dataset]. http://doi.org/10.1371/journal.pone.0127966.t002
    Explore at:
    xls (available download formats)
    Dataset updated
    Jun 2, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Minchul Kang; Manuel Andreani; Anne K. Kenworthy
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    In Binding Diffusion* (14), c_k represents the concentration of the ligand-receptor complex in the k-th image, c_k^M is the mobile fraction of c, β is the immobile fraction, and τ is the time needed for each image scan. Finally, g_1 and g_2 are the bleaching functions for bound (c) and free (u) proteins.
