Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
In our everyday lives, we are required to make decisions based upon our statistical intuitions. Often, these involve the comparison of two groups, such as luxury versus family cars and their suitability. Research has shown that the mean difference affects judgements where two sets of data are compared, but the variability of the data has only a minor influence, if any at all. However, prior research has tended to present raw data as simple lists of values. Here, we investigated whether displaying data visually, in the form of parallel dot plots, would lead viewers to incorporate variability information. In Experiment 1, we asked a large sample of people to compare two fictional groups (children who drank ‘Brain Juice’ versus water) in a one-shot design, where only a single comparison was made. Our results confirmed that only the mean difference between the groups predicted subsequent judgements of how much they differed, in line with previous work using lists of numbers. In Experiment 2, we asked each participant to make multiple comparisons, with both the mean difference and the pooled standard deviation varying across data sets they were shown. Here, we found that both sources of information were correctly incorporated when making responses. Taken together, we suggest that increasing the salience of variability information, through manipulating this factor across items seen, encourages viewers to consider this in their judgements. Such findings may have useful applications for best practices when teaching difficult concepts like sampling variation.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
Outliers correspond to fixes with location error (LE) > 3 standard deviations from the mean location error of all fixes in the same habitat (i.e., without regard to the visibility category). The last two columns report the mean number of outliers ± standard deviation for each visibility category, and LERMS values calculated from all fixes in the same habitat after removal of outlier values.
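As a sketch of how this outlier rule can be applied (the function names and data layout are illustrative, not taken from the original analysis):

```python
import statistics

def flag_outliers(location_errors, threshold_sd=3.0):
    """Flag fixes whose location error (LE) exceeds the habitat mean LE
    by more than threshold_sd standard deviations, pooling all fixes in
    the habitat without regard to visibility category."""
    mean_le = statistics.mean(location_errors)
    sd_le = statistics.stdev(location_errors)
    cutoff = mean_le + threshold_sd * sd_le
    return [le > cutoff for le in location_errors]

def le_rms(location_errors):
    """Root-mean-square location error (LERMS) of the retained fixes."""
    return (sum(le ** 2 for le in location_errors) / len(location_errors)) ** 0.5
```

Fixes flagged True would be removed before recomputing LERMS per habitat.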
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
Meta-analysis, which drives evidence-based practice, typically focuses on the average response of subjects to a treatment. For instance, in nutritional research, the difference in average weight of participants on different diets is typically used to draw conclusions about the relative efficacy of interventions. As a result of their focus on the mean, meta-analyses largely overlook the effects of treatments on inter-subject variability. Recent tools from the study of biological evolution, where inter-individual variability is one of the key ingredients for evolution by natural selection, now allow us to study inter-subject variability using established meta-analytic models. Here we use meta-analysis to study how low carbohydrate (LC) ad libitum diets and calorie restricted diets affect variance in mass. We find that LC ad libitum diets may have a more variable outcome than diets that prescribe a reduced calorie intake. Our results suggest that whilst LC diets are effective for a large proportion of the population, calorie restricted diets may be more effective for a subset of individuals. There is evidence that LC ad libitum diets rely on appetite suppression to drive weight loss. Extending this hypothesis, we suggest that between-individual variability in protein appetite may drive the trends that we report. A priori identification of an individual's target intake for protein may help define the most effective dietary intervention to prescribe for weight loss.
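The abstract does not spell out the model, but a common effect size in this literature for comparing variability between treatment arms is the log variance ratio (lnVR) of Nakagawa et al. (2015); a minimal sketch under that assumption (function names are mine):

```python
import math

def ln_vr(sd_t, n_t, sd_c, n_c):
    """Log variance ratio (lnVR) comparing the variability of a treatment
    group (sd_t, n_t) to a control group (sd_c, n_c), with the
    small-sample bias correction of Nakagawa et al. (2015)."""
    return (math.log(sd_t / sd_c)
            + 1.0 / (2 * (n_t - 1))
            - 1.0 / (2 * (n_c - 1)))

def ln_vr_sampling_variance(n_t, n_c):
    """Approximate sampling variance of lnVR, used to weight studies
    in a random-effects meta-analysis."""
    return 1.0 / (2 * (n_t - 1)) + 1.0 / (2 * (n_c - 1))
```

A positive lnVR would indicate a more variable outcome in the treatment arm (here, the LC ad libitum diets).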
Some of the SNK rasters intentionally do not align or have the same extent. These rasters were not snapped to a common raster, per the authors' discretion. Please review selected rasters prior to use. These varying alignments result from the use of differing source data sets and all products derived from them. We recommend that users snap or align rasters as best suits their own projects. The first set of files represents projections of the number of historical (1901-1981) standard deviations (SD) by which temperature and precipitation levels in each of three future decades (2020-2029, 2050-2059, 2060-2069) exceed the historical mean.
The second set of files represents projections of the proportion of years in a future decade when monthly temperature or precipitation levels are at least two historical SDs above the historical mean.
Temperature and precipitation are monthly means and totals, respectively.
The spatial extent is clipped to a Seward REA boundary bounding box.
In the first set of files, each file, referred to as SDclasses, consists of ordered categorical (factor) data, with three unique classes (factor levels), coded 0, 1 and 2. Within each file, raster grid cells categorized as 0 represent those where the future decadal mean temperature or precipitation value does not exceed one historical SD above the historical mean. Cells categorized as 1 represent those where future decadal values exceed the historical mean by at least one but less than two historical SDs. Cells categorized as 2 represent those where future decadal values exceed the historical mean by at least two historical SDs.
In the second set of files, each file, referred to as annProp, consists of numerical data. Within each file, raster grid cell values are proportions, ranging from zero to one, representing the proportion of years in a future decade when monthly mean temperature or monthly total precipitation are at least two historical SD above the historical mean. Proportions are calculated on five GCMs and then averaged rather than calculated on the five-model composite directly.
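A minimal sketch of the two per-cell computations described above (pure Python, one grid cell at a time; names are illustrative):

```python
def sd_class(future_decadal_mean, hist_mean, hist_sd):
    """Class 0: future decadal mean does not exceed one historical SD
    above the historical mean; class 1: at least one but less than two
    SDs; class 2: at least two SDs."""
    n_sd = (future_decadal_mean - hist_mean) / hist_sd
    if n_sd < 1:
        return 0
    if n_sd < 2:
        return 1
    return 2

def ann_prop(yearly_values, hist_mean, hist_sd):
    """Proportion of years in a decade when the monthly value is at
    least two historical SDs above the historical mean (per the text,
    computed per GCM and then averaged across the five GCMs)."""
    exceed = [v >= hist_mean + 2 * hist_sd for v in yearly_values]
    return sum(exceed) / len(exceed)
```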
Overview:
The historical monthly mean is calculated for each month as the 1901-1981 interannual mean, i.e., the mean of 81 annual monthly values.
The historical SD is calculated for each month as the 1901-1981 interannual SD, i.e., the SD of 81 annual monthly values.
2x2 km spatial resolution downscaled CRU 3.1 data is used as the historical baseline.
A five-model composite (average) of the Alaska top five AR4 2x2 km spatial resolution downscaled global circulation models (GCMs), using the A2 emissions scenario, is used for the future decadal datasets. This 5 Model Average is referred to by the acronym 5modelavg.
For a description of the model selection process, please see Walsh et al. 2008. Global Climate Model Performance over Alaska and Greenland. Journal of Climate, v. 21, pp. 6156-6174.
Emission scenarios in brief:
The Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) created a range of scenarios to explore alternative development pathways, covering a wide range of demographic, economic and technological driving forces and resulting greenhouse gas emissions. The B1 scenario describes a convergent world, a global population that peaks in mid-century, with rapid changes in economic structures toward a service and information economy. The Scenario A1B assumes a world of very rapid economic growth, a global population that peaks in mid-century, rapid introduction of new and more efficient technologies, and a balance between fossil fuels and other energy sources. The A2 scenario describes a very heterogeneous world with high population growth, slow economic development and slow technological change.
These files are bias corrected and downscaled via the delta method using PRISM (http://prism.oregonstate.edu) 1961-1990 2 km data as the baseline climate. Absolute anomalies are utilized for temperature variables; proportional anomalies are utilized for precipitation variables. Please see http://www.snap.uaf.edu/methods.php for a description of the downscaling process.
File naming scheme:
[variable]_[metric]_[groupModel]_[timeFrame].[fileFormat]
[variable]: pr, tas
[metric]: SDclasses, annProp
[groupModel]: 5modelAvg
[timeFrame]: decade_month
[fileFormat]: tif
examples:
pr_SDclasses_5modelAvg_2020s_01.tif
This file represents a spatially explicit map of the number of historical SDs by which projected 2020-2029 decadal mean January total precipitation exceeds the January total precipitation historical mean, where cell values are binned into three classes: less than one SD (coded 0), at least one but less than two SDs (coded 1), and at least two SDs (coded 2).
tas_annProp_5modelAvg_2060s_06.tif
This file represents a spatially explicit map of the proportion of years in the period 2060-2069 when June mean temperature projections are at least two historical SDs above the June mean temperature historical mean level, where cell values are proportions ranging from zero to one.
tas = near-surface air temperature
pr = precipitation including both liquid and solid phases
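A small parser for this naming scheme (the regex and function name are mine; it assumes the [timeFrame] part is a decade string like 2020s followed by a two-digit month, as in the examples above):

```python
import re

# Pattern for the naming scheme:
# [variable]_[metric]_[groupModel]_[decade]_[month].[fileFormat]
FNAME_RE = re.compile(
    r"(?P<variable>pr|tas)_"
    r"(?P<metric>SDclasses|annProp)_"
    r"(?P<group_model>5modelAvg)_"
    r"(?P<decade>\d{4}s)_"
    r"(?P<month>\d{2})\."
    r"(?P<file_format>tif)$",
    re.IGNORECASE,
)

def parse_snap_filename(name):
    """Split a file name into its scheme components, or raise ValueError."""
    m = FNAME_RE.match(name)
    if m is None:
        raise ValueError(f"filename does not follow the scheme: {name}")
    return m.groupdict()
```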
Std dev = standard deviation; sig = level of significance. Comparison between the two readers regarding CSA and TDPF.
Recent rapid climate warming at the western Antarctic Peninsula (WAP) results in elevated glacial melting, enhanced sedimentary run-off, increased turbidity and greater impact of ice-scouring in shallow coastal areas. Discharge of mineral suspension from volcanic bedrock ablation and chronic physical disturbance is expected to influence sessile filter feeders such as the Antarctic soft shell clam Laternula elliptica (King and Broderip, 1832). We investigated effects of sedimentary run-off on the accumulation of trace metals and, together with physical disturbance, the cumulative effect on oxidative stress parameters in younger and older L. elliptica from two stations in Potter Cove (King George Island, Antarctica) that are distinctly impacted by turbidity and ice-scouring. Fe, Mn, Sr, V and Zn concentrations were slightly higher in sediments of the station receiving more sediment run-off, but not enriched in bivalves of this station. The only element that increased in bivalves experimentally exposed to sediment suspension for 28 days was Mn. Concentration of the waste accumulation biomarker lipofuscin in nervous tissue was higher in L. elliptica from the "exposed" compared to the "less exposed" site, whereas protein carbonyl levels in bivalve mantle tissue were higher at the less sediment-impacted site. Tissue metal content and lipofuscin in nervous tissue were generally higher in older compared to younger individuals from both field stations. We conclude that elevated sediment ablation does not per se result in higher metal accumulation in L. elliptica. Rather than direct absorption from sediment particles, metal accumulation in gills seems to indicate uptake of compounds dissolved in the water column, whereas metals in the digestive gland appear to originate from enriched planktonic or detritic food. Accumulation of cellular waste products and potentially reactive metals over a lifetime presumably alters L. elliptica physiological performance with age and may contribute to higher stress susceptibility in older animals.
Excess Thorium-230 (230Thxs) as a constant flux tracer is an essential tool for paleoceanographic studies, but its limitations for flux normalization are still a matter of debate. In regions of rapid sediment accumulation, it has been an open question if 230Thxs-normalized fluxes are biased by particle sorting effects during sediment redistribution. In order to study the sorting effect of sediment transport on 230Thxs, we analyzed the specific activity of 230Thxs in different particle size classes of carbonate-rich sediments from the South East Atlantic, and of opal-rich sediments from the Atlantic sector of the Southern Ocean. At both sites, we compare the 230Thxs distribution in neighboring high vs. low accumulation settings. Two grain-size fractionation methods are explored.
We find that the 230Thxs distribution is strongly grain size dependent, and 50-90% of the total 230Thxs inventory is concentrated in fine material smaller than 10 µm, which is preferentially deposited at the high accumulation sites. This leads to an overestimation of the focusing factor Psi, and consequently to an underestimation of the vertical flux rate at such sites. The distribution of authigenic uranium indicates that fine organic-rich material has also been re-deposited from lateral sources. If the particle sorting effect is considered in the flux calculations, it reduces the estimated extent of sediment focusing. In order to assess the maximum effect of particle sorting on Psi, we present an extreme scenario, in which we assume a lateral sediment supply of only fine material (< 10 µm). In this case, the focusing factor of the opal-rich core would be reduced from Psi = 5.9 to Psi = 3.2. In a more likely scenario, allowing silt-sized material to be transported, Psi is reduced from 5.9 to 5.0 if particle sorting is taken into consideration. The bias introduced by particle sorting is most important for strongly focused sediments.
Comparing 230Thxs-normalized mass fluxes biased by sorting effects with uncorrected mass fluxes, we suggest that 230Thxs-normalization is still a valid tool to correct for lateral sediment redistribution. However, differences in focusing factors between core locations have to be evaluated carefully, taking the grain size distributions into consideration.
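The abstract uses the focusing factor Psi without defining it; a common formulation (following, e.g., Francois et al. 2004, simplified here with unit conventions of my own choosing) compares the measured 230Thxs inventory with the amount produced in the overlying water column during deposition:

```python
BETA = 0.0267  # 230Th production rate from 234U decay, dpm m^-3 yr^-1

def focusing_factor(th_xs_inventory, water_depth_m, interval_yr):
    """Psi = (decay-corrected 230Thxs inventory, dpm m^-2, deposited over
    interval_yr) / (water-column production over the same interval).
    Psi > 1 indicates sediment focusing; Psi < 1 indicates winnowing."""
    production = BETA * water_depth_m * interval_yr  # dpm m^-2
    return th_xs_inventory / production

def th_normalized_flux(component_conc, th_xs_activity, water_depth_m):
    """230Thxs-normalized vertical flux of a sediment component:
    F = BETA * z * C / [230Thxs], in units set by C and [230Thxs]."""
    return BETA * water_depth_m * component_conc / th_xs_activity
```

The particle-sorting bias discussed above enters through the inventory term: if fine, 230Thxs-rich material is preferentially deposited, the inventory and hence Psi are inflated.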
Attribution-NonCommercial 4.0 (CC BY-NC 4.0): https://creativecommons.org/licenses/by-nc/4.0/
This laboratory experiment was devoted to teaching descriptive statistics and the comparison of independent groups to STEAM (Science, Technology, Engineering, Arts, and Mathematics) students using open-source, graphical-user-interface software. Students answered 21 questions using JAMOVI on previously published data sets to learn fundamental statistics concepts. The experiment was divided into four parts. In the first part, descriptive statistics were carried out (mean, median, standard deviation, interquartile range, data normality, and skewness). In the second part, data normality was checked by visual inspection of plots (histograms and Q–Q plots). In the third part, two independent groups were compared. In the fourth part, more than two independent groups were compared. Many textbooks present comparisons between two or more groups assuming a normal and homogeneous distribution of the data: only parametric tests are taught, while nonparametric tests are not presented. Here, in contrast, data normality was checked using hypothesis tests (Shapiro–Wilk, Kolmogorov–Smirnov, and Anderson–Darling tests). Then, homogeneity of variances was checked using Levene's and Bartlett's tests. Normality and homogeneity were also checked by visual inspection of plots. Once normality and homogeneity were confirmed, parametric tests were used (t test and ANOVA). If the data were not normally distributed, nonparametric tests were used (Mann–Whitney and Kruskal–Wallis tests).
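The decision rule taught here can be mirrored in code; a sketch using SciPy (the lab itself used JAMOVI, and this two-group helper is my own illustration):

```python
from scipy import stats

def compare_two_groups(a, b, alpha=0.05):
    """Check normality (Shapiro-Wilk) and homogeneity of variances
    (Levene), then pick Student's/Welch's t test or the Mann-Whitney U
    test, following the workflow described above."""
    normal = (stats.shapiro(a).pvalue > alpha
              and stats.shapiro(b).pvalue > alpha)
    equal_var = stats.levene(a, b).pvalue > alpha
    if normal:
        result = stats.ttest_ind(a, b, equal_var=equal_var)
        chosen = "t test"
    else:
        result = stats.mannwhitneyu(a, b)
        chosen = "Mann-Whitney"
    return chosen, result.pvalue
```

For more than two groups, the analogous branch is ANOVA (stats.f_oneway) versus Kruskal-Wallis (stats.kruskal).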
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
This dataset contains the prioritization provided by a panel of 15 experts for a set of 28 barrier categories across 8 different roles of the future energy system. A Delphi method was followed, and the scores provided in the three rounds carried out are included. The dataset also contains the scripts used to assess the results and the output of this assessment. The information contained in this file is: data folder: this folder includes the scores given by the 15 experts in the 3 rounds. Every round is in an individual folder. There is a file per expert that holds the scores, from -5 (not relevant at all) to 5 (completely relevant), per barrier (rows) and actor (columns). There is also a file with the description of the experts in terms of their position in the company, the type of company and the country. fig folder: this folder includes the figures created to assess the information provided by the experts. For each round, the following figures are created (in each respective folder): boxplot with the distribution of scores per barrier and role; heatmap with the mean scores per barrier and role; boxplots comparing the distributions provided by the experts of each group (depending on the keywords) per barrier and role; heatmap with the mean score per barrier and use case and with the prioritization per barrier and use case. Finally, bar plots with the mean score differences between rounds and boxplots comparing the score distributions are also provided. stat folder: this folder includes the files with the results of the different statistical assessments carried out.
For each round, the following files are created (in each respective folder): the statistics used to assess the scores (intraclass correlation coefficient, inter-rater agreement, inter-rater agreement p-value, homogeneity of variances, average interquartile range, standard deviation of interquartile ranges, Friedman test p-value, average power of the post hoc test) per barrier and per role; the results of the post hoc of the Friedman test per barrier and per role; the average score per barrier and per role; the mean value of the scores provided by the experts grouped by the keywords per barrier and role, and the p-value of the comparison of these two values; and the final prioritization of the barriers for the use case (averaging the scores or merging the critical sets). Finally, the differences between the means and standard deviations of the scores between two consecutive rounds are provided.
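As an illustration of two of the per-barrier indicators listed (mean score and interquartile range, both commonly used in Delphi studies to gauge consensus); the data layout and function name are assumptions, not taken from the dataset's own scripts:

```python
import statistics

def barrier_summary(scores_by_expert):
    """scores_by_expert: list of per-expert score lists (one score per
    barrier, on the -5..5 scale used in the Delphi rounds).
    Returns per-barrier means and interquartile ranges; a small IQR
    suggests expert consensus on that barrier."""
    n_barriers = len(scores_by_expert[0])
    means, iqrs = [], []
    for j in range(n_barriers):
        col = [expert[j] for expert in scores_by_expert]
        means.append(statistics.mean(col))
        q = statistics.quantiles(col, n=4)  # Q1, Q2, Q3
        iqrs.append(q[2] - q[0])
    return means, iqrs
```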
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
This is the Baltic and North Sea Climatology (BNSC) for the Baltic Sea and the North Sea in the range 47°N to 66°N and 15°W to 30°E. It is the follow-up project to the KNSC climatology. The climatology was first made available to the public in March 2018 by ICDC and is published here in a slightly revised version 2. It contains the monthly averages of mean air pressure at sea level, air temperature at 2 m height, and dew point temperature at 2 m height. It is available on a 1° x 1° grid for the period from 1950 to 2015. For the calculation of the mean values, all available quality-controlled ship observations and buoy measurements of the DWD (German Meteorological Service) during this period were taken into account. Additional dew point values were calculated from relative humidity and air temperature where available. Climatologies were calculated for the WMO standard periods 1951-1980, 1961-1990, 1971-2000 and 1981-2010 (monthly mean values). As a prerequisite for the calculation of a 30-year climatology, at least 25 out of 30 (five-sixths) valid monthly means had to be present in the respective grid box. For the long-term climatology from 1950 to 2015, at least four-fifths of the monthly means had to be valid. Two methods were used (in combination) to calculate the monthly averages, to account for the small number of measurements per grid box and their uneven spatial and temporal distribution: 1. For parameters with a detectable annual cycle in the data (air temperature, dew point temperature), a 2nd-order polynomial was fitted to the data to reduce the variation within a month and thus the uncertainty of the calculated averages. In addition, for the mean value of air temperature, the daily temperature cycle was removed from the data. In the case of air pressure, which has no annual cycle, in version 2 no data gaps longer than 14 days per month and grid box were allowed for the calculation of a monthly mean and standard deviation.
This method differs from the KNSC and BNSC version 1, where mean and standard deviation were calculated from the means of 6-day windows. 2. If the number of observations fell below a certain threshold, which was 20 observations per grid box and month for the air temperature as well as for the dew point temperature, and 500 per box and month for the air pressure, data from the adjacent boxes were used for the calculation. The neighbouring boxes were used in two steps (the nearest 8 boxes, and if the number was still below the threshold, the next surrounding 16 boxes) to calculate the mean value of the center box. Thus, the spatial resolution of the parameters is reduced at certain points: instead of 1° x 1°, if neighbouring values are taken into account, data from an area of up to 5° x 5° may be averaged into a grid box value. This was especially the case for air pressure, where the 24 values of the neighbouring boxes were included in the averaging for most grid boxes. The mean value, the number of measurements, the standard deviation and the number of grid boxes used to calculate the mean values are available as parameters in the products. The calculated monthly and annual means were allocated to the centers of the grid boxes: latitudes 47.5, 48.5, ...; longitudes -14.5, -13.5, ... In order to remove any existing values over land, a land-sea mask was used, which is also provided at 1° x 1° resolution. In version 2 of the BNSC, a slightly different database was used than for the KNSC, which resulted in small changes (less than 1 K) in the means and standard deviations of the 2-meter air temperature and dew point temperature. The changes in mean sea level pressure values and the associated standard deviations are in the range of a few hPa compared to the KNSC. The parameter names and units have been adjusted to meet the CF 1.6 standard.
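The neighbour-widening rule in step 2 can be sketched as follows (data layout and names are mine; the thresholds are 20 observations for air temperature and dew point temperature, 500 for air pressure, as stated above):

```python
def grid_box_mean(values_by_ring, threshold):
    """Compute a grid-box monthly mean following the widening rule:
    start with the box's own observations; if the count is below
    `threshold`, add the nearest 8 neighbouring boxes, and if still
    below, the next surrounding 16 boxes (a 5x5 area in total).

    values_by_ring: [own_obs, ring1_obs, ring2_obs], lists of observations.
    Returns (mean, n_used), or (None, n) if even the widest area is too sparse.
    """
    pooled = []
    for ring in values_by_ring:
        pooled.extend(ring)
        if len(pooled) >= threshold:
            return sum(pooled) / len(pooled), len(pooled)
    return None, len(pooled)
```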
The chemistry of snow and ice cores from Svalbard is influenced by variations in the local sea ice margin and distance to open water. Snow pits sampled at two summits of Vestfonna ice cap (Nordaustlandet, Svalbard) exhibit spatially heterogeneous soluble ion concentrations despite similar accumulation rates, reflecting the importance of small-scale weather patterns on this island ice cap. The snow pack on the western summit shows higher average values of marine ions and a winter snow layer that is relatively depleted in sulphate. One part of the winter snow pack exhibits a [SO42-/Na+] ratio reduced by two thirds compared with its ratio in sea water. This low sulphate content in winter snow is interpreted as the signature of frost flowers, which are formed on young sea ice when offshore winds predominate. Frost flowers have been described as the dominant source of sea salt to aerosol and precipitation in ice cores in coastal Antarctica, but this is the first time their chemical signal has been described in the Arctic. The eastern summit does not show any frost flower signature, and we interpret the unusually dynamic ice transport and rapid formation of thin ice on the Hinlopen Strait as the source of the frost flowers.
The possibility of using colour data, a fast and inexpensive method of proxy data generation, extracted from two selected loess-paleosol sequences is discussed here. We compare the outcome of analysing outcrop images taken by digital cameras in the field with spectral colour data determined under controlled laboratory conditions. By nature, differences can be expected due to differences in illumination, moisture, and sample preparation. Outcrop inclination may be an issue for photographs; correcting for this is possible when marks can be used for rectification. In both cases the data extracted from images match the visual impression of the photos well, and are useful for obtaining a more quantitative measure for field observations. Smoothness (as measured by autocorrelation) is high for an image from Achenheim/France, where an image with a width of ca. 1.1 m and a depth of 1.6 m was analysed. Data from a narrower image part from Sanovita/Romania are noisier. In both example cases, a significant correlation between data extracted by digital image analysis and laboratory measurements could be established, suggesting that image analysis may be a useful tool where outcrop and light conditions allow useful photographs, especially where high-resolution proxy data are required.
https://www.nist.gov/open/license
These four data files contain datasets from an interlaboratory comparison that characterized a polydisperse five-population bead dispersion in water. A more detailed version of this description is available in the ReadMe file (PdP-ILC_datasets_ReadMe_v1.txt), which also includes definitions of abbreviations used in the data files. Paired samples were evaluated, so the datasets are organized as pairs associated with a randomly assigned laboratory number. The datasets are organized in the files by instrument type: PTA (particle tracking analysis), RMM (resonant mass measurement), ESZ (electrical sensing zone), and OTH (other techniques not covered in the three largest groups, including holographic particle characterization, laser diffraction, flow imaging, and flow cytometry). In the OTH group, the specific instrument type for each dataset is noted. Each instrument type (PTA, RMM, ESZ, OTH) has a dedicated file. Included in the data files for each dataset are: (1) the cumulative particle number concentration (PNC, (1/mL)); (2) the concentration distribution density (CDD, (1/mL·nm)) based upon five bins centered at each particle population peak diameter; (3) the CDD in higher resolution, varied-width bins. The lower-diameter bin edge (µm) is given for (2) and (3). Additionally, the PTA, RMM, and ESZ files each contain unweighted mean cumulative particle number concentrations and concentration distribution densities calculated from all datasets reporting values. The associated standard deviations and standard errors of the mean are also given. In the OTH file, the means and standard deviations were calculated using only data from one of the sub-groups (holographic particle characterization) that had n = 3 paired datasets. Where necessary, datasets not using the common bin resolutions are noted (PTA, OTH groups). 
The data contained here are presented and discussed in a manuscript to be submitted to the Journal of Pharmaceutical Sciences and presented as part of that scientific record.
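A sketch of the two reported quantities, under my reading of the description above (per-bin values are number concentrations in 1/mL; the CDD divides by bin width so varied-width bins remain comparable; function names are illustrative):

```python
def cumulative_pnc(bin_number_concentrations):
    """Cumulative particle number concentration (1/mL): running sum of
    per-bin number concentrations across the diameter bins."""
    total, cumulative = 0.0, []
    for c in bin_number_concentrations:
        total += c
        cumulative.append(total)
    return cumulative

def concentration_distribution_density(bin_number_concentrations, bin_edges_nm):
    """Concentration distribution density (1/(mL*nm)): per-bin number
    concentration divided by that bin's width in nm."""
    return [c / (hi - lo)
            for c, lo, hi in zip(bin_number_concentrations,
                                 bin_edges_nm, bin_edges_nm[1:])]
```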
Mesoscale eddies play a major role in controlling ocean biogeochemistry. By impacting nutrient availability and water column ventilation, they are of critical importance for oceanic primary production. In the eastern tropical South Pacific Ocean off Peru, where a large and persistent oxygen-deficient zone is present, mesoscale processes have been reported to occur frequently. However, investigations into their biological activity are mostly based on model simulations, and direct measurements of carbon and dinitrogen (N2) fixation are scarce. We examined an open-ocean cyclonic eddy and two anticyclonic mode water eddies: a coastal one and an open-ocean one in the waters off Peru along a section at 16°S in austral summer 2012. Molecular data and bioassay incubations point towards a difference between the active diazotrophic communities present in the cyclonic eddy and the anticyclonic mode water eddies. In the cyclonic eddy, highest rates of N2 fixation were measured in surface waters but no N2 fixation signal was detected at intermediate water depths. In contrast, both anticyclonic mode water eddies showed pronounced maxima in N2 fixation below the euphotic zone as evidenced by rate measurements and geochemical data. N2 fixation and carbon (C) fixation were higher in the young coastal mode water eddy compared to the older offshore mode water eddy. A co-occurrence between N2 fixation and biogenic N2, an indicator for N loss, indicated a link between N loss and N2 fixation in the mode water eddies, which was not observed for the cyclonic eddy. The comparison of two consecutive surveys of the coastal mode water eddy in November 2012 and December 2012 also revealed a reduction in N2 and C fixation at intermediate depths along with a reduction in chlorophyll by half, mirroring an aging effect in this eddy. Our data indicate an important role for anticyclonic mode water eddies in stimulating N2 fixation and thus supplying N offshore.
Note: Empty cells mean no such genotypes were found in our sample. Maj: major allele; Het: heterozygote; Min: minor allele. (a) Results (p values) of post hoc comparisons: mh = Maj versus Het, mm = Maj versus Min, hm = Het versus Min. (b) Post hoc comparison was not run because there were only 2 groups for this locus.
*Data expressed as mean ± standard deviation. †Mann-Whitney test. Comparison of phenotypic data between the two clusters of sequences defined according to the ID1 DSM.
A controversy currently exists regarding the number of Toba eruptive events represented in the tephra occurrences across peninsular India. Some claim the presence of a single bed, the 75,000-yr-old Toba tephra; others argue that dating and archaeological evidence suggest the presence of earlier Toba tephra. Resolution of this issue was sought through detailed geochemical analyses of a comprehensive suite of samples, allowing comparison of the Indian samples to those from the Toba caldera in northern Sumatra, from Malaysia, and, importantly, from the sedimentary core at ODP Site 758 in the Indian Ocean - a core that contains several of the earlier Toba tephra beds. In addition, two samples of Toba tephra from western India were dated by the fission-track method. The results unequivocally demonstrate that all the presently known Toba tephra occurrences in peninsular India belong to the 75,000 yr B.P. Toba eruption. Hence, this tephra bed can be used as an effective tool in the correlation and dating of late Quaternary sedimentary sequences across India, and it can no longer be used in support of a middle Pleistocene age for associated Acheulian artifacts.
Four models of fission track annealing in apatite are compared with measured fission track lengths in samples from Site 800 in the East Mariana Basin, Ocean Drilling Program Leg 129, given an independently determined temperature history. The temperature history of Site 800 was calculated using a one-dimensional, compactive, conductive heat flow model assuming two end-member thermal cases: one for cooling of Jurassic ocean crust that has experienced no subsequent heating, and one for cooling of Cretaceous ocean crust. Because the samples analyzed were only shallowly buried and because the tectonic history of the area since sample deposition is simple, resolution of the temperature history is high. The maximum temperature experienced by the sampled bed is between 16°C and 21°C and occurs at 96 Ma; temperatures since the Cretaceous have dropped in spite of continued pelagic sediment deposition because heat flow has continued to decay exponentially and bottom-water temperatures have dropped. Fission tracks observed within apatite grains from the sampled bed are 14.6 ± 0.1 µm (1 sigma) long. Given the proposed temperature history of the samples, one unpublished and three published models of fission track annealing predict mean track lengths from 14.8 to 15.9 µm. These models require temperatures as much as 40°C higher than the calculated paleotemperature maximum of the sampled bed to produce the same degree of track annealing. Measured and predicted values differ because the annealing models are based on extrapolation of high-temperature laboratory data to geologic times. The model that makes the closest prediction is based on the greatest number of experiments performed at low temperature and on an apatite with a composition closest to that of the core samples.
The aim of this paper is to analyze and compare the mineralogy and geochemistry of copper-zinc sulfide ores from the Logachev-2 and Rainbow hydrothermal fields of the Mid-Atlantic Ridge (MAR), confined to serpentinite protrusions. It was found that Zn(Fe) and Cu, Fe(Zn) sulfides had been deposited in black smoker pipes almost simultaneously from intermittently flowing, nonequilibrium, H2S-poor solutions of different temperatures. Pb isotope composition confirmed that the deep oceanic crust had been the source of lead. The ores from the Rainbow field are 20-fold higher in Co than ores restricted to basalts and show a high ratio of Co/Ni = 46. The ores from the Rainbow field are enriched in the 34S isotope (average d34S = 10 per mil) because of a constant flow of cold sea water into the subsurface zone of the hydrothermal system. Ores from the Logachev-2 field are 8 times higher in gold compared to other MAR regions. Sulfide ores from the Rainbow and Logachev-2 fields have no analogues among MAR ore occurrences in terms of enrichment in valuable components (Zn, Cd, Co, and Au).
The topographic survey of the studied outcrops is based on several thousand measurements per study site and on measurement of sample elevation with reference to sea level using a real-time kinematic (RTK) GPS Trimble R8. The maximum vertical (Z) and horizontal (X and Y) errors are ±2.0 cm and a few millimetres, respectively. The surveys were related to the French Polynesian Geodetic Network (Réseau Géodésique de Polynésie Française; RGPF), to operating tide gauges or tide gauge data sets, to probes deployed during the fieldwork, to the instantaneous sea level, or to modern adjacent microatolls growing in an environment similar to that of their fossil counterparts. In the absence of a geodetic datum or tide gauges, probes were deployed for four to five days to measure the sea-level position and to compare the data to the elevation of modern microatolls. The relative sea-level curve presented in this paper is based on data acquired on islands for which longer tidal records and geodetic data are available. After acquisition, the raw data were processed with two aims: 1) to estimate the elevation of individual dated fossil microatolls based on local tide gauge parameters, and 2) to compare the elevations of all dated fossil microatolls against the same vertical reference. The link between tide gauge data and the position of the living and fossil microatolls can be established using the RGPF, the official geodetic system in French Polynesia, which is associated with the NGPF (Nivellement Général de Polynésie Française) vertical datum. However, a topographic reference at the scale of French Polynesia (4,167 km²), which is mandatory to achieve the second objective, does not exist, as tide gauge observations are incomplete and the NGPF vertical datum is not homogeneous at this regional scale.
The French Polynesian Geodetic Network is a semi-dynamic system with different levels, established by the Naval Hydrographic and Oceanographic Service (Service Hydrographique et Océanographique de la Marine; SHOM) in cooperation with the National Geographic Institute (Institut Géographique National; IGN). Microatolls were selected for dating based on the lack of erosion features, the absence of local moating effects, and their mineralogical preservation, demonstrating that our database is robust. The chemical preparation, mass-spectrometer measurements, and age dating were performed between 2014 and 2016, mostly directly after field collection. The data are presented in Supplementary Table 2 following the recommendations of Dutton et al. (2017). The best-preserved samples, as indicated by X-ray powder diffraction (XRD) measurements, comprise 97.5% aragonite on average (n = 281). Additionally, no secondary aragonite or calcite crystals were revealed by thin-section and scanning electron microscope (SEM) observations.
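The two processing aims (tying each measured elevation to its local reference, then expressing all elevations against one shared vertical reference) amount to a simple offset correction per local datum. A minimal sketch, in which the sample IDs, field names, and offset values are hypothetical and purely illustrative, not real RGPF/NGPF quantities:

```python
# Hypothetical samples: elevation measured relative to a local reference
# (tide gauge, probe, or instantaneous sea level), as in the survey workflow.
fossil_microatolls = [
    {"id": "MA-01", "z_local_m": 1.42, "ref": "tide_gauge_A"},
    {"id": "MA-02", "z_local_m": 0.87, "ref": "probe_B"},
]

# Offset of each local reference relative to the shared vertical reference
# (illustrative values only).
datum_offset_m = {"tide_gauge_A": -0.15, "probe_B": 0.05}

def elevation_common_datum(sample):
    """Express a locally referenced elevation against the common vertical
    reference by adding the offset of its local reference."""
    return sample["z_local_m"] + datum_offset_m[sample["ref"]]

for s in fossil_microatolls:
    print(s["id"], round(elevation_common_datum(s), 2))
```

Once every sample carries such an offset, elevations from different islands and reference types become directly comparable, which is the point of the second processing aim.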