64 datasets found
  1. Plot Data- figure 4

    • figshare.com
    xlsx
    Updated Nov 20, 2021
    Cite
    Trevor Mitcham (2021). Plot Data- figure 4 [Dataset]. http://doi.org/10.6084/m9.figshare.17054204.v1
    Explore at:
    xlsxAvailable download formats
    Dataset updated
    Nov 20, 2021
    Dataset provided by
    figshare
    Figsharehttp://figshare.com/
    Authors
    Trevor Mitcham
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    These are the values used to create the plots in Figure 4; the plots were generated with MATLAB's errorbar function. Data are derived from the image files uploaded separately (i.e., Printed Contrast Phantom Data).
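    A minimal sketch of that kind of errorbar call, with purely illustrative values (the x positions, means, and error bars below are made up, not taken from the dataset):

      % Sketch only: illustrative errorbar call, not the authors' plotting script
      x   = 1:5;                        % hypothetical x positions
      y   = [0.8 1.1 0.9 1.3 1.2];      % hypothetical means
      err = [0.10 0.20 0.10 0.15 0.10]; % hypothetical standard deviations
      errorbar(x, y, err, 'o-');
      xlabel('Condition'); ylabel('Measured value');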

  2. PIED Piper: Piper Plots in Matlab

    • hydroshare.org
    • beta.hydroshare.org
    • +1more
    zip
    Updated Jun 3, 2020
    Cite
    Laura Lautz; Christopher Russoniello (2020). PIED Piper: Piper Plots in Matlab [Dataset]. http://doi.org/10.4211/hs.a42ad2a19f074a4b8dbc2ca0869cb9e8
    Explore at:
    zip(4.0 MB)Available download formats
    Dataset updated
    Jun 3, 2020
    Dataset provided by
    HydroShare
    Authors
    Laura Lautz; Christopher Russoniello
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    PIED Piper is a suite of Matlab functions and a standalone GUI compiled for the Windows operating system that allows the plotting of Piper diagrams. It is the first such code written in the Matlab programming language, one of the most commonly used coding languages for environmental scientists in academia and industry. The code and GUI allow plotting data as points, or plotting data density with colormaps or contour plots. More information can be found in Russoniello and Lautz (2019) or the PIED Piper Manual.

    The zip file contains MATLAB code, all files needed for installation of the GUI, the data input template, a manual describing the functionality of PIED Piper and several sample data sets.

    Please cite as: Russoniello, CJ, and LK Lautz. 2020. Pay the PIED Piper: Guidelines to visualize large geochemical datasets on Piper Diagrams. Groundwater, 58(3): 464-469. doi: 10.1111/gwat.12953

  3. Model Fit Plot & Assessment Matlab Scripts

    • dataverse.tdl.org
    txt
    Updated Dec 22, 2022
    + more versions
    Cite
    Logan Trujillo; Logan Trujillo (2022). Model Fit Plot & Assessment Matlab Scripts [Dataset]. http://doi.org/10.18738/T8/J2RF0F
    Explore at:
    txt(6683), txt(13310), txt(17045), txt(16427), txt(10974), txt(7065), txt(8813), txt(15458), txt(4266), txt(5471), txt(5014), txt(16159), txt(13962), txt(12021)Available download formats
    Dataset updated
    Dec 22, 2022
    Dataset provided by
    Texas Data Repository
    Authors
    Logan Trujillo; Logan Trujillo
    License

    CC0 1.0 Universal Public Domain Dedicationhttps://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    Matlab scripts to plot model fit results and compute auxiliary analyses. These files are associated with the following published study: Trujillo & Anderson (in press). Facial typicality and attractiveness reflect an ideal dimension of face structure. Cognitive Psychology. https://doi.org/10.1016/j.cogpsych.2022.101541. Please cite the files properly in any usage. All version changes to date (12/22/2022) reflect the uploading of new files created for revised analyses as part of the peer review of the associated manuscript.

  4. Data underpinning: BPM-Matlab - An open-source optical propagation...

    • datasetcatalog.nlm.nih.gov
    • data.dtu.dk
    Updated Apr 6, 2021
    Cite
    Andersen, Peter E.; Jensen, Stefan Mark; Marti, Dominik; Veettikazhy, Madhu; Hansen, Anders Kragh; Dholakia, Kishan; Andresen, Esben Ravn; Borre, Anja Lykke (2021). Data underpinning: BPM-Matlab - An open-source optical propagation simulation tool in MATLAB [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0000799949
    Explore at:
    Dataset updated
    Apr 6, 2021
    Authors
    Andersen, Peter E.; Jensen, Stefan Mark; Marti, Dominik; Veettikazhy, Madhu; Hansen, Anders Kragh; Dholakia, Kishan; Andresen, Esben Ravn; Borre, Anja Lykke
    Description

    BPM-Matlab is an open-source optical propagation simulation tool developed in the MATLAB environment for computationally efficient simulation of electric field propagation through a wide variety of optical fiber geometries, using the Douglas-Gunn Alternating Direction Implicit finite difference method. Validations of the BPM-Matlab numerical results are provided in the article by comparing them against published data and against results from state-of-the-art commercial software. The simulation tool is gratis, open-source, fast, user-friendly, and supports optional CUDA acceleration. It can be downloaded from https://gitlab.gbar.dtu.dk/biophotonics/BPM-Matlab. The software is published under the terms of the GPLv3 License. The data available here in DTU Data can be used to reproduce Figures 1-5 in the Optics Express manuscript titled 'BPM-Matlab - An open-source optical propagation simulation tool in MATLAB'. These data were generated using the BPM-Matlab software, except for Data_Fig1_d.mat. We suggest using MATLAB 2018a or newer to open and read the data. The data set is published under the terms of the Creative Commons Attribution 4.0 License.

    Data_Fig1_a_b_c.mat: This file can be used to reproduce Fig. 1(a-c) of the article, where BPM-Matlab is used to simulate beam propagation through a multimode fiber. The x and y axes values are available in the variables P.x and P.y. The E-field intensity at the proximal end in Fig. 1(a) can be calculated as abs(P.Einitial.').^2. The corresponding phase in Fig. 1(b) is available as angle(P.Einitial.'). The E-field intensity at the multimode fiber distal end in Fig. 1(c) can be calculated as abs(P.E.field.').^2.

    Data_Fig1_d.mat: The corresponding BeamLab simulation results of the same multimode fiber are available in this data file, which was generated using the BeamLab software. Use the variables bpmData.SlicesXZ.XData, bpmData.SlicesYZ.YData, and abs(bpmData.OutputField.E.x.').^2 to obtain x, y, and the distal E-field intensity, respectively.

    Data_Fig_2.mat: The data from this file will generate intensity profiles of the five lowest-order fiber modes supported by a straight and a bent multimode fiber, corresponding to Figure 2 of the article. The variables P_noBend and P_bend are struct variables that hold information about the spatial dimensions as well as the E-field profiles of the straight and bent modes. For the straight fiber case, the mode field profile is stored in P_noBend.modes(modeNumber).field, where x = dx*(-(Nx-1)/2:(Nx-1)/2) and y = dy*(-(Ny-1)/2:(Ny-1)/2), with Nx = size(P_noBend.modes(modeNumber).field,1), Ny = size(P_noBend.modes(modeNumber).field,2), dx = P_noBend.modes(modeNumber).Lx/Nx, and dy = P_noBend.modes(modeNumber).Ly/Ny. In a similar manner, the mode field profiles of the bent multimode fiber may be accessed from P_bend.

    Data_Fig3_a.mat: Use this data file to reproduce Figure 3(a) from the article, where numerical simulation results of different LP modes' normalized fractional power in a bent multimode fiber excited with the LP01 mode are presented. The MATLAB command semilogy(P.z.*1e3,P.modeOverlaps,'linewidth',2) will plot the mode overlap of LP01 with all 30 guided modes on a logarithmic scale. The command legend(P.modes.label,'location','eastoutside','FontSize',6) can be used to label the modes. Set the y-limits of the plot using ylim([1e-4 2]) to visualize the contribution from only the six most excited modes.

    Data_Fig3_b.mat: Load this data file and follow the steps described above for Data_Fig3_a in order to plot the normalized fractional power in a bent multimode fiber excited with the LP03 mode, as in Figure 3(b).

    Data_Fig_4.mat: To reproduce Figure 4(a) from the article, use the commands imagesc(P.z,P.x,abs(P.xzSlice).^2); ylim([-1 1]*0.75e-5); to plot the intensity profile in the xz plane of a multimode fiber tapered down to a single-mode fiber. For Figure 4(b), use plot(P.z,P.powers), which plots the power within the simulation window against the length P.z of the fiber.

    Data_Fig5_a.mat: This data file can be used to plot the intensity profile of the E-field at a distance of z = 5 mm after the non-twisted, straight multicore fiber distal end, as given in Figure 5(a) of the article. The E-field data after propagation from the distal end are available as E_out_fft.field, and the corresponding spatial dimensions are available as E_out_fft.x and E_out_fft.y. Use imagesc(x.*1e3,y.*1e3,abs(E_out_fft.field.').^2); axis image; to plot the field intensity profile. Similarly, use the .mat files below to reproduce Figure 5(b-d). Data_Fig5_b.mat - twisted straight multicore fiber; Data_Fig5_c.mat - non-twisted bent multicore fiber; Data_Fig5_d.mat - twisted bent multicore fiber.
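    For orientation, the quoted commands for Fig. 3(a) can be combined into one short script after loading the corresponding file (the load call and the axis labels below are assumptions, not quoted from the authors):

      % Sketch: reproduce Fig. 3(a) from Data_Fig3_a.mat using the commands quoted above
      load('Data_Fig3_a.mat');                               % provides the struct P
      figure;
      semilogy(P.z.*1e3, P.modeOverlaps, 'linewidth', 2);    % LP01 overlap with all 30 guided modes
      legend(P.modes.label, 'location', 'eastoutside', 'FontSize', 6);
      ylim([1e-4 2]);                                        % keep only the six most excited modes visible
      xlabel('z (mm)'); ylabel('Normalized fractional power');   % assumed labels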

  5. Single cell reconstruction data from MouseLight database and related Matlab...

    • datasetcatalog.nlm.nih.gov
    • figshare.com
    • +1more
    Updated Sep 20, 2023
    Cite
    chen, susu; Svoboda, Karel; Li, Nuo (2023). Single cell reconstruction data from MouseLight database and related Matlab analysis scripts [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0000988154
    Explore at:
    Dataset updated
    Sep 20, 2023
    Authors
    chen, susu; Svoboda, Karel; Li, Nuo
    Description

    This document describes how to identify and extract ALM neurons from the MouseLight database of individual reconstructed neurons (https://ml-neuronbrowser.janelia.org/) to define ALM projection zones (relevant to Chen, Liu et al., Cell, 2023). All scripts are in Matlab R2022b.

    /MouseLight_figshare/MouseLightComplete contains all reconstructed single neurons from the MouseLight data set, in .json and .swc formats.

    Use 'ExtractMouseLightNeuronsFromJsonFiles.m' to extract MouseLight neuron ID, soma coordinates and annotation, and axon coordinates from the above directory.

    Use 'ExtractMouseLightALMneurons.m' to identify and extract ALM neurons from the MouseLight data set. ALM neurons are defined based on functional maps of ALM (photoinhibition) in the CCF coordinate system, contained in 'ALM_functionalData.nii' (from Li, Daie, et al., Nature, 2016).

    Use 'ALMprojDensity.m' to compute and generate an ALM projection map based on axonal density. The map is saved in 'ALM_mask_150um3Dgauss_Bilateral.mat' as smoothed (3D Gaussian, sigma = 150 um) axonal density in a 3D matrix, F_smooth. First axis: dorsal-ventral; second axis: medial-lateral; third axis: anterior-posterior.

    Use 'medial_lateral_ALMprojDensities.m' to compute and generate medial and lateral ALM projection maps separately. Medial ALM: soma location < 1.5 mm from the midline; lateral ALM: soma location > 1.5 mm from the midline. The maps are saved in 'medialALM_mask_150um3Dgauss_Bilateral.mat' and 'lateralALM_mask_150um3Dgauss_Bilateral.mat' as smoothed (3D Gaussian, sigma = 150 um) axonal density in 3D matrices, respectively.

    Use 'PlotALMinCCF.m' to plot voxels of ALM in CCF, defined by the functional maps in 'ALM_functionalData.nii'.

    Use 'PlotMouseLightALMneurons.m' to plot ALM neurons (all, medial, or lateral) in CCF; figures are saved in .tiff format.

    Other functions: 'loadTifFast.m' is called to load the CCF .tif file (Annotation_new_10_ds222_16bit.tif). 'plotCCFbrain.m' is called to plot an isosurface of the CCF brain (Annotation_new_10_ds222_16bit.tif).
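    A rough end-to-end order of the scripts above, assuming each runs as a plain MATLAB script with no arguments (the call style is an assumption; see each script's header for its actual inputs):

      % Hypothetical processing order of the scripts described above
      ExtractMouseLightNeuronsFromJsonFiles   % neuron IDs, soma coordinates/annotation, axon coordinates
      ExtractMouseLightALMneurons             % select ALM neurons via ALM_functionalData.nii
      ALMprojDensity                          % smoothed (sigma = 150 um) axonal-density projection map
      medial_lateral_ALMprojDensities         % separate medial (<1.5 mm) and lateral (>1.5 mm) maps
      PlotALMinCCF                            % plot ALM voxels in CCF
      PlotMouseLightALMneurons                % plot ALM neurons in CCF and save .tiff figures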

  6. Data from: MATLAB scripts, files, and datasets supplementary to the paper...

    • data.4tu.nl
    zip
    Updated Aug 2, 2024
    Cite
    Paolo Tasseron (2024). MATLAB scripts, files, and datasets supplementary to the paper "Riverbank plastic distributions and how to sample them" [Dataset]. http://doi.org/10.4121/3f5d76d4-c361-4158-84f9-fef02b2db898.v1
    Explore at:
    zipAvailable download formats
    Dataset updated
    Aug 2, 2024
    Dataset provided by
    4TU.ResearchData
    Authors
    Paolo Tasseron
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    Feb 2022 - Mar 2022
    Area covered
    The Netherlands
    Description

    This dataset is the supplementary material of Tasseron et al. (2024): "Riverbank Plastic Distributions and how to sample them" [submitted to a scientific journal for review]. The dataset contains three main folders. The first folder, 'MATLAB data', contains all data files [.mat], [.xlsx], and [.csv] required for the analyses performed on the eight riverbanks central to the paper. The second folder, 'MATLAB scripts', contains the master script of all analyses [Matlab_master_script.m], the file to calculate Moran's I [MoransI_calculation.m], and a folder with all associated files to plot violin plots in Matlab [Violinplot-Matlab-master]. The third folder, 'River-OSPAR datasheets', contains the datasheet used for measurements in the field [Field_sheet_OSPAR_measurement.pdf] and an overview of the riverbank layout [Riverbank_layout.pdf].
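    The Moran's I computation itself lives in the authors' MoransI_calculation.m; for orientation only, a generic sketch of the standard statistic (not the authors' script; the weight matrix W is an assumed input) is:

      % Generic Moran's I (standard definition), not the authors' MoransI_calculation.m
      % x: vector of observations; W: spatial weight matrix with zero diagonal
      function I = morans_i(x, W)
          x = x(:) - mean(x);                             % center the observations
          n = numel(x);
          I = (n / sum(W(:))) * (x' * W * x) / (x' * x);  % normalized spatial autocorrelation
      end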

  7. Code used in MATLAB and R for the purpose of generating HX difference plots...

    • data.niaid.nih.gov
    • datadryad.org
    zip
    Updated Jan 22, 2025
    Cite
    Emily Smith; Ramya Billur; Marie-France Langelier; Tanaji Talele; John Pascal; Ben Black (2025). Code used in MATLAB and R for the purpose of generating HX difference plots and HX ribbon plots [Dataset]. http://doi.org/10.5061/dryad.qjq2bvqq6
    Explore at:
    zipAvailable download formats
    Dataset updated
    Jan 22, 2025
    Dataset provided by
    Raymond and Ruth Perelman School of Medicine at the University of Pennsylvania
    St. John's University
    Université de Montréal
    St. Jude Children's Research Hospital
    Authors
    Emily Smith; Ramya Billur; Marie-France Langelier; Tanaji Talele; John Pascal; Ben Black
    License

    https://spdx.org/licenses/CC0-1.0.html

    Description

    PARP1 and PARP2 recognize DNA breaks immediately upon their formation, generate a burst of local PARylation to signal their location, and are co-targeted by all current FDA-approved forms of PARP inhibitors (PARPi) used in the cancer clinic. Recent evidence indicates that the same PARPi molecules impact PARP2 differently from PARP1, raising the possibility that allosteric activation may also differ. We find that unlike for PARP1, destabilization of the autoinhibitory domain of PARP2 is insufficient for DNA damage-induced catalytic activation. Rather, PARP2 activation requires further unfolding of an active site helix. In contrast, the corresponding helix in PARP1 only transiently forms, even prior to engaging DNA. Only one clinical PARPi, Olaparib, stabilizes the PARP2 active site helix, representing a structural feature with the potential to discriminate small molecule inhibitors. Collectively, our findings reveal unanticipated differences in local structure and changes in activation-coupled backbone dynamics between human PARP1 and PARP2.

    Methods
    HDExaminer software (v 2.5.0) was used, which uses peptide pool information to identify the deuterated peptides for every sample in the HXMS experiment. The quality of each peptide was further assessed by manually checking mass spectra. The level of HX of each reported deuterated peptide is corrected for loss of deuterium label (back-exchange after quench) during HXMS data collection by normalizing to the maximal deuteration level of that peptide in the fully-deuterated (FD) samples. After normalizing, we then compared the extent of deuteration to the theoretical maximal deuteration (maxD, i.e. if no back-exchange occurs). The data analysis statistics for all the protein states are in Table S2 of Smith-Pillet et al., Mol Cell 2025. The difference plots for the deuteration levels between any two samples were obtained through an in-house script written in MATLAB. The script compares the deuteration levels between two samples (e.g. PARP2 and PARP2 with 5’P nicked DNA) and plots the percent difference of each peptide, by subtracting the percent deuteration of PARP2 with 5’P nicked DNA from PARP2 and plotting according to the color legend in stepwise increments. The plot of representative peptide data is shown as the mean of three independent measurements +/- SD. Statistical analysis included a t-test with a P-value <0.05. HX experiments of PARP1 with or without DNA and/or EB-47 have been published. To compare PARP1 and PARP2 datasets, HX samples of PARP1 were repeated in triplicate to have the same peptide digestions and subsequent peptide data, and HX changes in HD peptides were compared between PARP1 and PARP2 with the indicated conditions. HXMS data at 100 s for PARP2 and in the presence of gap DNA, 5’OH nicked DNA, and 5’P nicked DNA was plotted through an in-house script written in R (see Fig. S1A in Smith-Pillet et al., Mol Cell 2025).
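    The in-house MATLAB difference-plot script is not part of this listing; a minimal sketch of the per-peptide computation it describes (all variable names and values below are illustrative assumptions) would be:

      % Hedged sketch of the HX difference plot described above (not the authors' script)
      pctD_ref = [45.2 60.1 30.5 72.8];   % hypothetical %D per peptide, e.g. PARP2 alone
      pctD_dna = [38.7 61.0 22.4 70.1];   % hypothetical %D per peptide, e.g. PARP2 + 5'P nicked DNA
      diffD    = pctD_ref - pctD_dna;     % percent difference per peptide
      bar(diffD);                         % the real script colors peptides in stepwise increments
      xlabel('Peptide index'); ylabel('\Delta %D');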

  8. Data from: Kibble-Zurek dynamics in a trapped ultracold Bose gas

    • data.ncl.ac.uk
    bin
    Updated May 31, 2023
    Cite
    I-Kang Liu; Nikolaos Proukakis (2023). Kibble-Zurek dynamics in a trapped ultracold Bose gas [Dataset]. http://doi.org/10.25405/data.ncl.12604721.v1
    Explore at:
    binAvailable download formats
    Dataset updated
    May 31, 2023
    Dataset provided by
    Newcastle University
    Authors
    I-Kang Liu; Nikolaos Proukakis
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Readme for 10.25405/data.ncl.12604721. The files are in MATLAB format, and the corresponding MATLAB scripts for Figs. 2 and 4 to 11 are included (named by the *.m files). If there is any question/problem, please contact i-kang.liu1@newcastle.ac.uk. P.S. The plotting code "shadedErrorBarG" is adapted from the code "shadedErrorBar" written by Rob Campbell, https://www.mathworks.com/matlabcentral/fileexchange/26311-raacampbell-shadederrorbar, with some minor modifications for this work, and is attached for use by the appended plotting scripts.

    * File Name: eqbm_data.mat
    This file contains the information for Fig. 2(a) and Fig. 9. The variables are listed below and can easily be plotted using fig_2_a_b.m and fig_9.m.
    For Fig. 2(a): "TovTc" is T/T_c,0, where T_c,0 is the condensate transition temperature for an ideal Bose gas; "dTovTc" is the deviation of T/T_c,0 due to the particle-number deviation in the c-field simulation and is very small; "TovTc_expt" is the T/T_c,0 from the experimental measurement; "f0" and "f0_expt" are the condensate fractions of simulation and experiment, respectively; "dfc" is the deviation of the condensate fraction from simulation; "mu" and "T" are the temperature change from -tau_Q to tau_Q for the background colour.
    For Fig. 9: "CbPO" and "Cb" are the Binder cumulants computed according to Eqs. (A2) and (A1), respectively; "m_order" is the order parameter; "lcoh" and "dlcoh" are the correlation length and its confidence upper/lower bound from the fit; "ldB" is the thermal de Broglie wavelength computed from the temperature "T".

    * File Name: dync_tauQ.mat
    "tauQ" are the quench durations considered in this work.

    * File Name: fig_2_b_and_c.mat
    Contains the information for Fig. 2(b) and (c).
    For Fig. 2(b): "time_tc" is the time t-t_c in ms for tau_Q = 150 ms; "time_tc_eqbm" is the time axis for the equilibrium data in t-t_c; "lcoh" and "dlcoh" are the correlation length and its standard deviation for tau_Q = 150 ms; "lcoh_eqbm" and "dlcoh_eqbm" are the correlation length and its confidence bound for the equilibrium data; "DeltaLcoh" and "dDeltaLcoh" are the value of Eq. (15) and its error bar.
    For Fig. 2(c): "t" is a 1 x 5 array with the time information; "x", "y" and "z" are the spatial axes; "uPOt" contains the 5 PO modes for the snapshots listed in "t"; "aho" is the length unit in metres.
    The isosurface plot can be produced with the MATLAB script below. The purple line is the high-velocity-field region; please refer to Ref. [19] for detail.

      % ----- MATLAB PLOTTING SCRIPT -----
      jj = 1;   % number of the snapshot from Fig. 2(c), i to v
      u = reshape(uPOt(jj,:), [length(y) length(x) length(z)]);
      [X,Y,Z] = meshgrid(x,y,z);
      figure;
      [faces,verts] = isosurface(X*aho*1e6, Y*aho*1e6, Z*aho*1e6, abs(u).^2, 1000);
      patch('Vertices', verts, 'Faces', faces, 'FaceColor', 'g', 'edgecolor', 'none', 'FaceAlpha', 0.2);
      [faces,verts] = isosurface(X*aho*1e6, Y*aho*1e6, Z*aho*1e6, abs(u).^2, 50);
      patch('Vertices', verts, 'Faces', faces, 'FaceColor', 'y', 'edgecolor', 'none', 'FaceAlpha', 0.125);
      view(3); axis xy equal; camlight head; lighting phong;
      xlim([-1 1] * 125);
      title(['t-t_c=' num2str(t(jj)-tc_factor*150) ' ms'])
      % The values 1000 and 50 correspond to the green and yellow isosurfaces in the plots

    * File Name: dync_main.mat
    This file contains most of the information of this work for Figs. 4, 5, 6, 8 and 11, including the momentum occupations, spatial densities and density wavefronts for the 6 dynamical data sets. Information is saved in CELL array format, and the j-th cell corresponds to the information of the j-th tauQ in "dync_tauQ.mat". The wavefronts are smoothed data after tracing the density. The plots can be reproduced by fig_4_5.m, fig_6.m, fig_8.m and fig_11.m.
    "aho": the length unit.
    "nk0", "dnk0" and "dnk0l": cell arrays for the momentum occupation of the k=0 mode ("nk0"), its standard deviation ("dnk0"), and the lower-bound error ("dnk0l") for plotting in log scale, for Fig. 4.
    "kxp": the k_x axis for the plots, in dimensionless units (a_ho*k_x), for Fig. 5.
    "nkx" and "dnkx": cell arrays for the momentum occupation along the k_x axis ("nkx") and its standard deviation ("dnkx"), for Fig. 5.
    "xp": the x axis for the plots, in units of "aho".
    "den_x" and "dden_x": cell arrays for the spatial density along the x axis and its standard deviation, in dimensionless units, for Fig. 6.
    "time_tc": cell arrays for the time axis t-t_c in ms for the momentum data. For the scaled axis in the figures, read \hat{t} from "dync_tauQ" by evaluating \hat{t} = sqrt(t0*tauQ), with t0 = hbar/(gamma*muf)*1e3, gamma = 5e-4 and muf = 22*hbar*w_perp = 22*hbar*131.4 Hz, and plot the above quantities against the (time_tc - 1.3*that)/tauQ axis.
    "time_tc_den": cell arrays for the time axis t-t_c in ms for the density data, scaled in the same way as "time_tc".
    "den_x_front" and "t_den_x_front": the density front and the corresponding time axis, respectively, for the x direction, in units of "aho".
    "den_rho_front" and "t_den_rho_front": the density front and the corresponding time axis, respectively, for the transverse direction, in units of "aho".
    "den_cen" and "dden_cen": the central densities and their standard deviations for the different quench durations, for Fig. 6 and Fig. 8(a).
    "i_target": the time snapshot index for Fig. 11(c) and (d).
    "cmap_nonscale": the colormap for the left column of Fig. 5; "cmap_k": the colourmap for the right column of Fig. 5; "cmap_fig_11_a": the colormap for Fig. 11(a); "cmap_fig_11_b": the colormap for Fig. 11(b).

    * File Name: dync_lcoh.mat
    Contains the information for Fig. 7 and Fig. 10(b), for use with fig_7_10.m. "lcoh" and "dlcoh" are the dynamical correlation lengths and their standard deviations for the different quench durations; "time_tc" is the time axis in t-t_c in ms.
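    The scaled time axis described above can be computed directly; a short sketch following the formula in the readme (the hbar value is the standard SI constant, everything else is taken from the description):

      % Sketch of the \hat{t} scaling used for the figure axes
      hbar  = 1.0546e-34;              % J s
      gamma = 5e-4;
      muf   = 22*hbar*131.4;           % 22*hbar*w_perp with w_perp = 131.4 Hz
      t0    = hbar/(gamma*muf)*1e3;    % in ms
      that  = sqrt(t0*tauQ);           % tauQ from dync_tauQ.mat, in ms
      % plot the quantities from dync_main.mat against (time_tc - 1.3*that)/tauQ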

  9. Simrad acoustic data from RVIB Nathaniel B. Palmer cruise NBP0204 in the...

    • search.dataone.org
    • bco-dmo.org
    Updated Dec 5, 2021
    Cite
    Scott Gallager (2021). Simrad acoustic data from RVIB Nathaniel B. Palmer cruise NBP0204 in the Southern Ocean in 2002 (SOGLOBEC project; Southern Ocean Krill project) [Dataset]. https://search.dataone.org/view/sha256%3A8d76fc2cbefb3e8f825a03cba508a14eb3fec3b53d40940b8a830c2216423a88
    Explore at:
    Dataset updated
    Dec 5, 2021
    Dataset provided by
    Biological and Chemical Oceanography Data Management Office (BCO-DMO)
    Authors
    Scott Gallager
    Area covered
    Southern Ocean
    Description

    Information file for processing and plotting simrad files

    The original telegrams from the Simrad EK500 for the 38, 120, and 200 kHz transducers are transmitted to RVDAS via RS232 at 115 kb and logged daily, with a new file started at 0000 GMT (2000 h local). To process the raw Simrad output, the Perl script simandmwaythen.pl (written by Karen Fisher) picks up the day.dat file, which must be set by the user, and loads and outputs a series of files for each frequency and time. The MATLAB m-file simthenplotter_2.m picks up the output from the Perl script, plots it, and then saves .mat files. The m-file plotsimrad.m asks you for the MATLAB filename and plots the output.

    All the MATLAB files for each day of NBP0204 are available in gzipped format. All the user needs to do is unzip the .mat file of choice and run plotsimrad.m. You do not need to go back to the raw data if you just wish to plot data for a particular day or time window. The data in the *.mat files are corrected for the offset produced by the noise margin setting on the Simrad and calibrated by cross-correlation with data from biomaperII when it was in the water at the same time. This is not a true calibration since the accuracy is unknown, but it probably gives correct backscatter intensity to within a few dB. The range is -100 to -40 and is color-scaled (thanks to Gareth Lawson) in the plots just as in the output from biomaperII.

    plotsimrad.m allows you to select the times within a given day to plot, so you can "zoom in" on specific activities such as CTD or MOCNESS casts. The Perl script is made available in case the user wishes to go back to the raw data and re-process for some reason. There is also a script to cross-plot backscatter intensity against temperature gradient data from the CTD and CMIPS during each cast. Contact Scott Gallager if you are interested.
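    A minimal usage sketch of the steps above (the daily file name is a placeholder; plotsimrad.m prompts for the filename interactively):

      % Hedged usage sketch; the gzipped file name below is hypothetical
      gunzip('nbp0204_day.mat.gz');   % unzip the daily .mat file of choice
      plotsimrad                      % prompts for the .mat filename, then plots the corrected backscatter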

    Note about downloading the above program files:
    When you click on the above links, the perl script and the matlab files may be output to your browser window. To download them, use 'save as' at your browser's file menu. You can avoid the screen output and download directly by first holding the 'shift' key and then clicking on the link.

    Other helpful files:
    cmap.m
    initialize_simradplot.m

    16 sept 2002
    Scott Gallager
    sgallager@whoi.edu

  10. Data set for "The magnetic scalar potential and demagnetization vector for a...

    • data.dtu.dk
    txt
    Updated Oct 27, 2025
    Cite
    Rasmus Bjørk (2025). Data set for "The magnetic scalar potential and demagnetization vector for a cylinder tile" [Dataset]. http://doi.org/10.11583/DTU.28067213.v1
    Explore at:
    txtAvailable download formats
    Dataset updated
    Oct 27, 2025
    Dataset provided by
    Technical University of Denmark
    Authors
    Rasmus Bjørk
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This is the data set for the paper [Bjørk, R., The magnetic scalar potential and demagnetization vector for a cylinder tile, Journal of Magnetism and Magnetic Materials, 632, 173519, 2025]. The DOI for the publication is 10.1016/j.jmmm.2025.173519. These data are located on the repository at data.dtu.dk, with DOI 10.11583/DTU.28067213 and URL https://doi.org/10.11583/DTU.28067213.

    The folder "Validation_potential_cylinder" contains the finite element validation data for the full cylinder. The file Cylinder_potential.mph is a COMSOL model (version 6.3) that computes the magnetic scalar potential for a fixed geometry using a finite element approach, to compare with the analytical solution. In order to reduce file size, the mesh and solution are removed from the file and thus need to be recomputed. The files Line01.txt to Line17.txt are text files that contain, in the first column, the arc length (length along the line) for the specific line chosen (there are 17 different lines) and, in the next 11 columns, the magnetic scalar potential for the 11 different magnetizations specified in the Matlab file.

    The folder "Validation_potential_cylinder_slice" contains the finite element validation data for the cylinder slice. The file Cylinder_slice_potential_1.mph is a COMSOL model (version 6.3) that computes the magnetic scalar potential for a fixed geometry (geometry 1) using a finite element approach, to compare with the analytical solution. In order to reduce file size, the mesh and solution are removed from the file and thus need to be recomputed. The files Cylinder_slice_potential_2.mph to Cylinder_slice_potential_6.mph are similar to Cylinder_slice_potential_1.mph, except for different cylinder slice geometries. The files Line01_1.txt to Line17_1.txt are text files with results for geometry 1 from the COMSOL simulations (Cylinder_slice_potential_1.mph) that contain, in the first column, the arc length (length along the line) for the specific line chosen (there are 17 different lines) and, in the next 11 columns, the magnetic scalar potential for the 11 different magnetizations specified in the Matlab file. The files Line01_2.txt to Line17_2.txt are similar files for geometry 2, and so on up to "_6".

    The file MagTense_Validation_cylinder_and_cylindrical_slice_potential.m contains the analytical expressions for the scalar magnetic potential for a cylinder and a cylindrical slice. Its arguments and how to call it can be seen from the two example files Cylinder_potential_specific_line_plot.m and Cylinder_potential_specific_surface_plot.m. The file Cylinder_potential_specific_line_plot.m is a Matlab file that plots Figs. 3 and 5 from the publication by evaluating the scalar potential along the specific lines specified in the publication. The file Cylinder_potential_specific_surface_plot.m is a Matlab file that calculates the scalar potential for a specific geometry and magnetization and saves this for plotting. The file Plot_specific_surface_plot.m is a Matlab file that plots Figs. 2 and 4 from the publication by loading the scalar potential data saved in the files Full_8_0.01_z_0.2.mat and Slice_4_0.01_z_0.mat and plotting these. The file Full_8_0.01_z_0.2.mat is a Matlab data file containing the magnetic scalar potential for geometry 1 and magnetization direction 8 in the slice z = 0.2 m. The file Slice_4_0.01_z_0.mat is a Matlab data file containing the magnetic scalar potential for geometry 1 and magnetization direction 4 in the slice z = 0 m. The file ds2nfu.m is a Matlab file that is used to determine coordinates in figures, to plot labels etc. The folder elliptic-master is the Matlab/Octave implementation of elliptic integrals of three types from Moiseev I., Elliptic functions for Matlab and Octave (2008), GitHub repository, DOI: http://dx.doi.org/10.5281/zenodo.48264. All functions are also provided in the MagTense framework, found at www.magtense.org.
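    A quick way to inspect one of the validation files, assuming the column layout described above (readmatrix requires MATLAB R2019a or newer):

      % Sketch: plot the 11 magnetization cases from one validation line file
      D = readmatrix('Line01.txt');     % column 1: arc length; columns 2-12: scalar potential
      plot(D(:,1), D(:,2:end));
      xlabel('Arc length'); ylabel('Magnetic scalar potential');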

  11. Lab and TELEMAC 3D model data of mean flow and turbulence around a monopile

    • zenodo.org
    bin, csv, txt
    Updated Oct 1, 2025
    Cite
    Christopher Adam Unsworth; Christopher Adam Unsworth (2025). Lab and TELEMAC 3D model data of mean flow and turbulence around a monopile [Dataset]. http://doi.org/10.5281/zenodo.15046910
    Explore at:
    csv, bin, txtAvailable download formats
    Dataset updated
    Oct 1, 2025
    Dataset provided by
    Zenodohttp://zenodo.org/
    Authors
    Christopher Adam Unsworth; Christopher Adam Unsworth
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Data used in the publication of this research. The data are saved as .csv files; the code used to plot the data is MATLAB, but it is saved as plain text files. Some tips and notes on using this code and data are included.

  12. Analysis code and data for FF CO2

    • zenodo.org
    zip
    Updated Sep 2, 2025
    Cite
    Zhe Jin; Yue He; Kai Wang; Zhe Jin; Yue He; Kai Wang (2025). Analysis code and data for FF CO2 [Dataset]. http://doi.org/10.5281/zenodo.17035880
    Explore at:
    zipAvailable download formats
    Dataset updated
    Sep 2, 2025
    Dataset provided by
    Zenodohttp://zenodo.org/
    Authors
    Zhe Jin; Yue He; Kai Wang; Zhe Jin; Yue He; Kai Wang
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This folder provides the data and code necessary to reproduce the main findings of the manuscript.

    The code includes both NCL (NCAR Command Language) and MATLAB scripts. Data processing is mainly performed using NCL, while MATLAB is used to plot the ternary diagram in Figure 3.

    The version of NCL used is 6.6.2, and the version of MATLAB is R2022b.


    Introduction to each directory
    data: Contains the basic data required to run the code, as well as the data generated after execution.
    scripts: Contains the code required for data processing and plotting.
    CCGCRV: Includes code for a curve fitting method that is incorporated in a standard software package provided by NOAA Earth System Research Laboratories (https://gml.noaa.gov/aftp/user/thoning/ccgcrv/).


    Introduction to using the scripts:

    Figure 1:
    BRW_MLO_seasonal_index.ncl: Calculate SCA, SZC, and AZC at BRW and MLO
    BRW_MLO_SCA_trend.ncl, BRW_MLO_SZC_trend.ncl, BRW_MLO_AZC_trend.ncl: Calculate trends in SCA, SZC, and AZC at BRW and MLO
    Figure 1 was plotted using Origin 2022, based on data calculated from the NCL scripts above.

    Figure 2:
    Gridded_SCA_trend_HIS_Base.ncl: Calculate the gridded SCA trend in the Northern Hemisphere from the HIS_Base experiment.
    The other NCL scripts starting with "Gridded" serve similar purposes, but the parameters calculated, time periods, and control experiments differ.
    Fig2_SCA_SZC_AZC_trend_His_CMIP6.ncl: The script for plotting Figure 2.

    Figure 3:
    Fig3_contribution.m: Quantitatively characterize the contributions of fossil fuel emissions, terrestrial CO2 fluxes, and oceanic CO2 fluxes to the trends in the CO2 seasonal parameters, and plot a ternary diagram.
    multiVarMapTri.m, tight_subplot.m: Subroutines required for plotting.


    Other notes:
    Since the CO2 concentration data simulated by GEOS-Chem are very large in volume, we only provide the simulation results of the HIS_Base experiment for the year 1979 as an example here. Other data are available from the corresponding author upon reasonable request.

  13. Data set for "Design, optimisation and operation of a high power...

    • data.dtu.dk
    zip
    Updated Aug 30, 2024
    Cite
    Christian Bahl; Arendse Gideon; Mikael Alexander Vinogradov Levy; Jacob Birkjær Marcussen; Rasmus Bjørk (2024). Data set for "Design, optimisation and operation of a high power thermomagnetic harvester" [Dataset]. http://doi.org/10.11583/DTU.26242868.v1
    Explore at:
    zipAvailable download formats
    Dataset updated
    Aug 30, 2024
    Dataset provided by
    Technical University of Denmark
    Authors
    Christian Bahl; Arendse Gideon; Mikael Alexander Vinogradov Levy; Jacob Birkjær Marcussen; Rasmus Bjørk
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The data set for the article [Bahl, C. R., Engelbrecht, K., Gideon, A., Levy, M. A. V., Marcussen, J. B., Imbaquingo, C., & Bjørk, R. (2024). Design, optimization and operation of a high power thermomagnetic harvester. Applied Energy, 376, 124304]. DOI for the publication: 10.1016/j.apenergy.2024.124304. The data are stored in four zip files, each containing a single folder, according to the different sections in the paper.

    The folder "Simulation_design" contains three Origin plot files, which hold the data for Figs. 3-5, Fig. 6 and Fig. 7. Both plots and data are contained in the Origin files.

    The folder "Experiment_thin_coils" contains an Origin file with the data and plot for Fig. 9. Furthermore, it contains Matlab scripts for plotting Fig. 10 and Fig. 11, as well as the supporting file "lvm_import.m" for importing the lvm files, which contain the raw experimental data. The RMS voltage plotted in Fig. 11 is given in the file "Voltage_RMS.txt".

    The folder "Experiment_big_coils" contains an Excel sheet with the data shown in Fig. 13, as well as the raw data, Raw_data.xlsx, from the experiments needed to produce the average data in the Fig_13_data.xlsx file. The data are described within the Excel files. The same folder also contains the raw data in lvm format for the experiments with and without foam. The files are named according to the frequency, flow rate and temperature span and can be read with the "lvm_import.m" file in Matlab.
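    For orientation, the RMS value reported in "Voltage_RMS.txt" is the usual root-mean-square of the measured coil voltage; a hedged sketch (the voltage trace below is a placeholder, and the import step with lvm_import.m is assumed, not shown):

      % Hedged sketch of an RMS voltage computation; v would come from lvm_import.m
      v     = randn(1e4,1);          % placeholder for the imported coil voltage trace
      V_rms = sqrt(mean(v.^2));      % root-mean-square voltage, as reported in Voltage_RMS.txt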

  14. 1000 Empirical Time series

    • researchdata.edu.au
    Updated Oct 11, 2017
    + more versions
    Cite
    Ben Fulcher; Ben Fulcher (2017). 1000 Empirical Time series [Dataset]. http://doi.org/10.6084/m9.figshare.5436136.v9
    Explore at:
    Dataset updated
    Oct 11, 2017
    Dataset provided by
    Monash University
    Authors
    Ben Fulcher; Ben Fulcher
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    A diverse selection of 1000 empirical time series, along with results of an hctsa feature extraction, using v1.03 of hctsa and Matlab 2019b, computed on a linux server at Sydney University.


    The results of the computation are in the hctsa file, HCTSA_Empirical1000.mat for use in Matlab using v1.03 of hctsa.

    The same data are available in .csv format (e.g., for use in non-Matlab computing environments): hctsa_datamatrix.csv holds the results of the feature computation, hctsa_timeseries-info.csv describes the rows (time series), hctsa_features.csv describes the columns (features), and hctsa_timeseries-data.csv holds the data of the individual time series (one time series per line, in the order described in hctsa_timeseries-info.csv). Note that these files were produced by running >> OutputToCSV('HCTSA_Empirical1000.mat',true); in hctsa.

    The input file, INP_Empirical1000.mat, is for use with hctsa, and contains the time-series data and metadata for the 1000 time series. For example, massive feature extraction from these data on the user's machine, using hctsa, can proceed as
    >> TS_Init('INP_Empirical1000.mat');

    Some visualizations of the dataset are in CarpetPlot.png (first 1000 samples of all time series as a carpet (color) plot) and 150TS-250samples.png (conventional time-series plots of the first 250 samples of a sample of 150 time series from the dataset). More visualizations can be performed by the user using TS_PlotTimeseries from the hctsa package.
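    Taken together, the commands quoted above form a short local workflow; a sketch assuming hctsa v1.03 is on the MATLAB path (TS_Compute is the standard hctsa computation step, not quoted in this description, and default arguments are assumed):

      % Sketch of the hctsa workflow described above
      TS_Init('INP_Empirical1000.mat');              % initialise from the input file
      TS_Compute();                                  % run the massive feature extraction (time-consuming)
      OutputToCSV('HCTSA_Empirical1000.mat', true);  % export results to the .csv files listed above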

    See links in references for more comprehensive documentation for performing methodological comparison using this dataset, and on how to download and use v1.03 of hctsa.

  15. Woodward et al. Graph Plotting Data from Physical flow effects can dictate...

    • rs.figshare.com
    bin
    Updated Jun 1, 2023
    + more versions
    Cite
    J. R. Woodward; J. W. Pitchford; M. A. Bees (2023). Woodward et al. Graph Plotting Data from Physical flow effects can dictate plankton population dynamics [Dataset]. http://doi.org/10.6084/m9.figshare.8869484.v1
    Explore at:
    binAvailable download formats
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    Royal Societyhttp://royalsociety.org/
    Authors
    J. R. Woodward; J. W. Pitchford; M. A. Bees
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    MATLAB data file for plotting graphs for Figures 5, 6 and 7a

  16. Data from: Indicator from the graph Laplacian of stock market time series...

    • researchdata.ntu.edu.sg
    Updated Sep 23, 2024
    Cite
    Zheng Tien Kang; Zheng Tien Kang (2024). Indicator from the graph Laplacian of stock market time series cross sections can precisely determine the durations of market crashes [Dataset]. http://doi.org/10.21979/N9/7YNZAQ
    Explore at:
    application/x-compressed(4042980746), application/x-compressed(7263573043), application/x-compressed(327987), txt(4855)Available download formats
    Dataset updated
    Sep 23, 2024
    Dataset provided by
    DR-NTU (Data)
    Authors
    Zheng Tien Kang; Zheng Tien Kang
    License

    Attribution-NonCommercial 4.0 (CC BY-NC 4.0)https://creativecommons.org/licenses/by-nc/4.0/
    License information was derived automatically

    Time period covered
    Jan 1, 2019 - Jun 30, 2022
    Dataset funded by
    Ministry of Education, Singapore
    Ministry of Education (MOE)
    Description

    This repository includes the processed ultrametric distance matrices, MATLAB scripts, and data holder files (in .mat format) used to generate the results and figures in the PLOS paper with the above title.

  17. Data from: On the role of rheological memory for convection-driven plate...

    • zenodo.org
    tar, txt
    Updated Aug 24, 2022
    Cite
    Lukas Fuchs; Lukas Fuchs; Thorsten W. Becker; Thorsten W. Becker (2022). On the role of rheological memory for convection-driven plate reorganizations [Dataset]. http://doi.org/10.5281/zenodo.6546322
    Explore at:
    tar, txtAvailable download formats
    Dataset updated
    Aug 24, 2022
    Dataset provided by
    Zenodohttp://zenodo.org/
    Authors
    Lukas Fuchs; Lukas Fuchs; Thorsten W. Becker; Thorsten W. Becker
    License

    Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    % ==================================================================== %
    Data directory for 3D spherical, thermal convection models presented
    in Fuchs and Becker (2022), with a strain-dependent weakening and
    hardening rheology following the formulation of Fuchs and Becker (2019).

    The directory contains input_files, MATLAB_Scripts, and one directory
    per model:
    input_files Contains all the input files for CitcomS
    MATLAB_Scripts Contains all the MATLAB scripts required to reproduce
    the figures in the manuscript
    $ModelName Contains a MATLAB directory for individual data from each
    model, a TPR directory for toroidal-poloidal data for each degree,
    and a txt_data directory for data picked from the CitcomS models,
    which is then visualized in MATLAB. For the models discussed in the
    paper, surface map plots and surface grd-files are available within
    those directories as well.
    % ==================================================================== %

    MATLAB_Scripts directory:
    -------------------------
    To visualize certain data for each model you can run the script
    Analyze_Citcom_Models in the MATLAB_Scripts directory, e.g.,

    Analyze_Citcom_Models(Name,PlotParam,PlotParam2,S)

    where
    Name is the model name as a string variable,
    PlotParam is a switch to save (1) or not save (0) the figures,
    PlotParam2 is a switch to activate (1) or deactivate (0) plotting,
    S is a switch to define the scaling of the model parameters:
    no scaling (0), scaling with the diffusion time scale (1),
    or scaling with the overturn time OT (2).

    With this script one can visualize all the time-dependent data picked
    from the CitcomS models. The CitcomS models can be reproduced with the
    input files given in the input_files directory.
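    A concrete call, following the argument description above (the model name is a placeholder):

      % Example: plot and save figures for one model, scaled by the overturn time
      Analyze_Citcom_Models('ModelName', 1, 1, 2)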

    To reproduce the box-whisker plots in figures 2, 3, S7, and S8 one
    needs to run the script CompStat. This script reads in the data
    from all models (from the txt_files directory in the $ModelName
    directory), creates a box-whisker plot for each model period, and
    plots them against the average lithospheric damage (gamma_L).

    For more details on each MATLAB script, see the help comments within the
    script.

    In case of any questions, do not hesitate to contact me via email:

    lukas.fuchs84 at gmail dot com

    % ==================================================================== %
    % =============================== END ================================ %
    % ==================================================================== %

  18. Data from: Supplementary data to the article: Impact of fracture geometry...

    • data.4tu.nl
    zip
    Updated May 26, 2020
    Cite
    Weiwei Zhu (2020). Supplementary data to the article: Impact of fracture geometry and topology on the connectivity and flow properties of stochastic fracture networks [Dataset]. http://doi.org/10.4121/uuid:8d54c374-c84e-41f7-9393-93105776724c
    Explore at:
    zipAvailable download formats
    Dataset updated
    May 26, 2020
    Dataset provided by
    4TU.Centre for Research Data
    Authors
    Weiwei Zhu
    License

    CC0 1.0 Universal Public Domain Dedicationhttps://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    The research investigates the impact of fracture geometry and topology on the connectivity and flow properties of stochastic fracture networks. The data are generated by our in-house developed C++ code. From the shared data, you can:
    1. Find the data to reproduce Fig. 7 and 8 in the paper; those data are averaged over 6000 random realizations for each case;
    2. Find the data to plot 2D fracture networks and their graph representations;
    3. Find the data to plot 3D fracture networks and their graph representations;
    4. Find the data to plot the pressure distribution in 2D fracture networks;
    5. Find the data to plot the pressure distribution in 3D fracture networks;
    6. Find the corresponding Matlab code to help you visualize the results.
    To have a better visualization, the system sizes of the 2D and 3D fracture networks are modified. Those modifications are only for visualization purposes and will not change the results presented in the paper.

  19. Ultimate_Analysis

    • data.mendeley.com
    Updated Jan 28, 2022
    + more versions
    Cite
    Akara Kijkarncharoensin (2022). Ultimate_Analysis [Dataset]. http://doi.org/10.17632/t8x96g88p3.2
    Explore at:
    Dataset updated
    Jan 28, 2022
    Authors
    Akara Kijkarncharoensin
    License

    MIT Licensehttps://opensource.org/licenses/MIT
    License information was derived automatically

    Description

    This database studies the performance inconsistency of biomass HHV models based on ultimate analysis. The research null hypothesis is consistency in the rank of a biomass HHV model. Fifteen biomass models are trained and tested on four datasets. In each dataset, the rank invariability of these 15 models indicates the performance consistency.

    The database includes the datasets and source codes used to analyze the performance consistency of the biomass HHV models. The datasets are stored as tables in an Excel workbook. The source codes implement the biomass HHV machine learning models through MATLAB object-oriented programming (OOP). These machine learning models consist of eight regressions, four supervised learning methods, and three neural networks.

    An Excel workbook, "BiomassDataSetUltimate.xlsx," collects the research datasets in six worksheets. The first worksheet, "Ultimate," contains 908 HHV records from 20 pieces of literature. The column names of the worksheet indicate the elements of the ultimate analysis on a % dry basis. The HHV column refers to the higher heating value in MJ/kg. The following worksheet, "Full Residuals," backs up the model testing residuals based on the 20-fold cross-validations. The article (Kijkarncharoensin & Innet, 2021) verifies the performance consistency through these residuals. The other worksheets present the literature datasets implemented to train and test the model performance in many pieces of literature.

    A file named "SourceCodeUltimate.rar" collects the MATLAB machine learning models implemented in the article. The list of folders in this file is the class structure of the machine learning models. These classes extend the features of the original MATLAB Statistics and Machine Learning Toolbox to support, e.g., the k-fold cross-validation. The MATLAB script named "runStudyUltimate.m" is the article's main program to analyze the performance consistency of the biomass HHV model through the ultimate analysis. The script loads the datasets from the Excel workbook and automatically fits the biomass models through the OOP classes.

    The first section of the MATLAB script generates the most accurate model by optimizing the models' hyperparameters. The first run takes a few hours to train the machine learning models via this trial-and-error process. The trained models can be saved in a MATLAB .mat file and loaded back into the MATLAB workspace. The remaining script, separated by script section breaks, performs the residual analysis to inspect the performance consistency. Furthermore, a 3D scatter plot of the biomass data and box plots of the prediction residuals are produced. Finally, the interpretations of these results are examined in the authors' article.
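    As a quick orientation to the workbook layout described above (the element column names C and H below are assumptions about the ultimate-analysis columns; only the sheet name and the HHV column are stated in the description):

      % Sketch: load the "Ultimate" worksheet and view the data as a 3D scatter plot
      T = readtable('BiomassDataSetUltimate.xlsx', 'Sheet', 'Ultimate');
      scatter3(T.C, T.H, T.HHV);        % column names C and H are assumed; HHV is in MJ/kg
      xlabel('C (% dry basis)'); ylabel('H (% dry basis)'); zlabel('HHV (MJ/kg)');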

    Reference : Kijkarncharoensin, A., & Innet, S. (2022). Performance inconsistency of the Biomass Higher Heating Value (HHV) Models derived from Ultimate Analysis [Manuscript in preparation]. University of the Thai Chamber of Commerce.

  20. Replication Data for: Long-Range Attraction between Graphene and Water/Oil...

    • dataverse.harvard.edu
    • search.dataone.org
    Updated Jan 2, 2024
    Cite
    Avishi Abeywickrama; Douglas H. Adamson; Hannes C. Schniepp (2024). Replication Data for: Long-Range Attraction between Graphene and Water/Oil Interfaces [Dataset]. http://doi.org/10.7910/DVN/MEBRW1
    Explore at:
    Croissant
    Dataset updated
    Jan 2, 2024
    Dataset provided by
    Harvard Dataverse
    Authors
    Avishi Abeywickrama; Douglas H. Adamson; Hannes C. Schniepp
    License

    CC0 1.0 Universal Public Domain Dedicationhttps://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Dataset funded by
    National Science Foundation
    Description

    Atomic force microscopy (AFM) image raw data, force spectroscopy raw data, data analysis/data plotting, and force modeling.

    File Formats
    The raw files of the AFM imaging scans of the colloidal probe surface are provided in NT-MDT's proprietary .mdt file format, which can be opened using the Gwyddion software package. Gwyddion has been released under the GNU public software license GPLv3 and can be downloaded free of charge at http://gwyddion.net/. The processed image files are included in Gwyddion's .gwy file format. Force spectroscopy raw files are also provided in .mdt file format, which can be opened using NT-MDT's NOVA Px software (we used 3.2.5 rev. 10881). All the force data were converted to ASCII files (*.txt) using the NOVA Px software to also provide them in human-readable form with this data set. The MATLAB codes used for force curve processing and data analysis are given as *.m files and can be opened by MATLAB (https://www.mathworks.com/products/matlab) or by a text editor. The raw and processed force curve data and other values used for data processing are stored in binary form in .mat MATLAB data files, which can be opened by MATLAB. Organized by figure, all the raw and processed force curve data are given in Excel worksheets (.xlsx), one per probe/substrate combination.

    Data (Folder Structure)
    The data in the dataverse is best viewed in "Tree" mode.

    Codes for Force Curve Processing
    The three MATLAB codes used for force curve processing are contained in this folder. The text file Read me.txt provides all the instructions to process raw force data using these three MATLAB codes.

    Figure 3B, 3C – AFM images
    The raw (.mdt) and processed (.gwy) AFM images of the colloidal probe before and after coating with graphene oxide (GO) are contained in this folder.

    Figure 4 – Force Curve GO
    The raw data of the force curve shown in Figure 4 and the substrate force curve data (used to find the inverse optical lever sensitivity) are given as .mdt files and were exported as ASCII files given in the same folder. The raw and processed force curve data are also given in the variables_GO_Tip 18.mat and GO_Tip 18.xlsx files. The force curve processing codes and instructions can be found in the "Codes for Force Curve Processing" folder, as mentioned above.

    Figure 5A – Force–Displacement Curves GO, rGO1, rGO10
    All the raw data of the force curves (GO, rGO1, rGO10) shown in Figure 5A and the corresponding substrate force curve data (used to find the inverse optical lever sensitivity) are given as .mdt files and were exported as ASCII files given in the same folder. The raw and processed force curve data are also given in *.mat and *.xlsx files.

    Figure 5B, 5C – Averages of Force and Displacement for Snap-On and Pull-Off Events
    All the raw data of the force curves (GO, rGO1, rGO10) for all the probes and the corresponding substrate force curve data are given as .mdt files and were exported as ASCII files given in this folder. The raw and processed force curve data are also provided in *.mat and *.xlsx files. The snap-on force, snap-on displacement, and pull-off displacement values were obtained from each force curve and averaged as in Code_Figure5B_5C.m. The same code was used for plotting the average values.

    Figure 6A – Force–Distance Curves GO, rGO1, rGO10
    The raw data provided in the "Figure 5A – Force Displacement Curves GO, rGO1, rGO10" folder were processed into force-vs-distance curves. The raw and processed force curve data are also given in *.mat and *.xlsx files.

    Figure 6B – Average Snap-On and Pull-Off Distances
    The same raw data provided in the "Figure 5B, 5C – Average Snap on Force, Displacement, Pull off Displacement" folder were processed into force-vs-distance curves. The raw and processed force curve data of GO, rGO1, rGO10 for all the probes are also given in *.mat and *.xlsx files. The snap-on distance and pull-off distance values were obtained from each force curve and averaged as in Code_Figure6B.m. The code used for plotting is also given in the same text file.

    Figure 6C – Contact Angles
    Advancing and receding contact angles were calculated using each processed force-vs-distance curve and averaged according to the reduction time. The obtained values and the code used to plot them are given in Code_Figure6C.m.

    Figure 9A – Force Curve Repetition
    The raw data of all five force curves and the substrate force curve data are given as .mdt files and were exported as ASCII files given in the same folder. The raw and processed force curve data are also given in *.mat and *.xlsx files.

    Figure 9B – Repulsive Force Comparison
    The data of the zoomed-in region of Figure 9A was plotted as the "Experimental curve". Initial baseline correction was done using the MATLAB code bc.m, and the procedure is given in the Read Me.txt text file. All the raw and processed data are given in the rGO10_Tip19_Trial1.xlsx and variables_rGO10_Tip 19.mat files. The MATLAB code used to model other forces and plot all the curves in Figure 9B is given in...
