Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This section presents a discussion of the research data. The data were received as secondary data; however, they were originally collected using time study techniques. Data validation is a crucial step in the analysis process, ensuring that the data are accurate, complete, and reliable, and descriptive statistics were used for this purpose. The mean, mode, standard deviation, variance, and range summarise the data distribution and assist in identifying outliers or unusual patterns. The dataset reports the measures of central tendency: the mean, median, and mode. The mean is the average value of each factor presented in the tables; it is the balance point of the dataset and its typical value. The median is the middle value for each factor: half of the values lie below it and half above, which makes it especially informative for skewed distributions. The mode is the most common value in the dataset and describes the most typical observation. Together, these values describe the central value around which the data are distributed. Because the mean, median, and mode are neither equal nor close to one another, they indicate a skewed distribution. The dataset also presents the results and their discussion. This section focuses on customising the DMAIC (Define, Measure, Analyse, Improve, Control) framework to address the specific concerns outlined in the problem statement. To gain a comprehensive understanding of the current process, value stream mapping was employed, complemented by measurement of the factors that contribute to inefficiencies. These factors were then analysed and ranked by impact using factor analysis. To mitigate the impact of the most influential factor on project inefficiencies, a solution is proposed using the EOQ (Economic Order Quantity) model. The implementation of the 'CiteOps' software facilitates improved scheduling, monitoring, and task delegation in the construction project through digitalisation; project progress and efficiency can be monitored remotely and in real time. In summary, the DMAIC framework was tailored to the requirements of the specific project, incorporating techniques from inventory management, project management, and statistics to effectively minimise inefficiencies within the construction project.
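As a reference for the inventory component, here is a minimal sketch of the classic EOQ formula, Q* = sqrt(2DS/H); the demand, ordering-cost, and holding-cost figures below are hypothetical and not taken from the dataset.

```python
from math import sqrt

def eoq(annual_demand: float, order_cost: float, holding_cost: float) -> float:
    """Classic Economic Order Quantity: Q* = sqrt(2DS/H).

    D = annual demand (units/year), S = fixed cost per order,
    H = holding cost per unit per year. Illustrative values only;
    the dataset's actual inputs are not specified here.
    """
    return sqrt(2 * annual_demand * order_cost / holding_cost)

# Hypothetical material order for a construction site:
# 12,000 units/year demand, 450 per order, 15/unit/year holding cost.
print(round(eoq(12_000, 450, 15)))  # ~849 units per order
```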
Attribution-NonCommercial 4.0 (CC BY-NC 4.0)https://creativecommons.org/licenses/by-nc/4.0/
License information was derived automatically
The design and synthesis of a series of 2,7-diazaspiro[4.4]nonane derivatives as potent sigma receptor (SR) ligands with analgesic activity are the focus of this work. Affinities at S1R and S2R were measured, and molecular modeling studies were performed to investigate the binding pose characteristics. The most promising compounds were subjected to in vitro toxicity testing and subsequently screened for in vivo analgesic properties. Compound 9d (AD258) exhibited negligible in vitro cellular toxicity and high binding affinity for both SRs (Ki S1R = 3.5 nM, Ki S2R = 2.6 nM) but not for other pain-related targets, and exerted high potency in a model of capsaicin-induced allodynia, reaching the maximum antiallodynic effect at very low doses (0.6–1.25 mg/kg). Functional activity experiments showed that S1R antagonism is required for the effects of 9d and that the compound did not induce motor impairment. In addition, 9d exhibited a favorable pharmacokinetic profile.
MIT Licensehttps://opensource.org/licenses/MIT
License information was derived automatically
Dataset Documentation
Overview
Dataset Name: sigma_dataset
Short Description: This dataset consists of meteorological (time series) and geophysical (catchment attributes) data for 85 basins in Kazakhstan. It is intended for use in weather forecasting or modeling, as well as flood prediction based on the attributes provided.
Long Description: We developed basin-scale hydrometeorological forcing data for 85 basins in the conterminous Kazakhstan basin subset. Retrospective… See the full description on the dataset page: https://huggingface.co/datasets/floodpeople/sigma_dataset.
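Assuming the dataset follows standard Hugging Face Hub conventions, a minimal loading sketch (split and feature names are assumptions and should be checked on the dataset page):

```python
# Sketch: load the catalogued dataset with the Hugging Face `datasets` library.
from datasets import load_dataset

ds = load_dataset("floodpeople/sigma_dataset")  # repo id from the description
print(ds)  # inspect the available splits and features before use
```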
https://vocab.nerc.ac.uk/collection/L08/current/UN/
The World Ocean Isopycnal-Level Velocity (WOIL-V) climatology was derived from the United States Navy's Generalized Digital Environmental Model (GDEM) temperature and salinity profiles using the P-Vector method. The absolute velocity data have the same horizontal resolution and temporal variation (annual, monthly) as the GDEM (T, S) fields: a horizontal resolution of 0.5 degrees × 0.5 degrees and 222 isopycnal levels (sigma-theta levels) from sigma-theta = 22.200 to 27.725 kg m-3 with an increment of delta sigma-theta = 0.025 kg m-3. In the equatorial zone (5 degrees S – 5 degrees N), however, the velocities are questionable, because geostrophic balance, the theoretical basis of the P-vector inverse method, does not hold there. The GDEM model on which the calculations are based includes data from the 1920s onwards, and WOIL-V will be updated with the same frequency as GDEM. The climatological velocity field on isopycnal surfaces is dynamically compatible with the GDEM (T, S) fields and provides background ocean currents for oceanographic and climate studies, especially ocean isopycnal modeling. The climatology was prepared by the Department of Oceanography, Naval Postgraduate School.
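For readers reconstructing the vertical grid, a short sketch that regenerates the 222 sigma-theta levels from the stated bounds and increment (NumPy is our choice here, not part of the original product):

```python
import numpy as np

# Reconstruct the 222 isopycnal (sigma-theta) levels described above:
# 22.200 to 27.725 kg m-3 in steps of 0.025 kg m-3.
sigma_theta = np.arange(22.200, 27.725 + 0.0125, 0.025)
assert sigma_theta.size == 222
print(sigma_theta[:3], "...", sigma_theta[-1])
```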
NRL HYCOM 1/25 deg model output, Gulf of Mexico, 10.04 Expt 31.0, 2009-2014, At Surface The HYCOM consortium is a multi-institutional effort sponsored by the National Ocean Partnership Program (NOPP), as part of the U. S. Global Ocean Data Assimilation Experiment (GODAE), to develop and evaluate a data-assimilative hybrid isopycnal-sigma-pressure (generalized) coordinate ocean model (called HYbrid Coordinate Ocean Model or HYCOM).
CC0 1.0 Universal Public Domain Dedicationhttps://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
The files contain outputs from numerical simulations using a combination of the numerical models Geospace Environment Model of Ion-Neutral Interactions (GEMINI) and Satellite-beacon Ionospheric-scintillation Global Model of the upper Atmosphere (SIGMA). The outputs provide simulated time series of Global Positioning System (GPS) scintillations through density structures generated by the Kelvin-Helmholtz instability (KHI), as explained in detail in the publication. The numerical codes used to generate the outputs are described in the following publications:
• Zettergren, M., Semeter, J., & Dahlgren, H. (2015). "Dynamics of density cavities generated by frictional heating: Formation, distortion, and instability". Geophysical Research Letters, 42(23). [Available at https://agupubs.onlinelibrary.wiley.com/doi/full/10.1002/2015GL066806]
• Deshpande, K. B., Bust, G. S., Clauer, C. R., Rino, C. L., & Carrano, C. S. (2014). "Satellite-beacon Ionospheric-scintillation Global Model of the upper Atmosphere (SIGMA) I: High latitude sensitivity study of the model parameters". Journal of Geophysical Research: Space Physics, 119, 4026-4043. doi:10.1002/2013JA019699. [Available at https://agupubs.onlinelibrary.wiley.com/doi/full/10.1002/2013JA019699]
• Deshpande, K. B., & Zettergren, M. D. (2019). "Satellite-Beacon Ionospheric Scintillation Global Model of the Upper Atmosphere (SIGMA) III: Scintillation Simulation Using A Physics-Based Plasma Model". Geophysical Research Letters, 46(9), 4564-4572. doi:10.1029/2019GL082576. [Available at https://agupubs.onlinelibrary.wiley.com/doi/full/10.1029/2019GL082576]
GEMINI is free, open-source software and can be downloaded from GitHub at https://github.com/gemini3d/. Build instructions, example simulations, and documentation are also included on this website. Uploaded by A.S.
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Background: Multiple studies have indicated that the minimal model of hip structure can enhance hip fracture risk assessment. This study aimed to investigate the independent association between minimal model variables and hip fracture risk in Han Chinese individuals. Methods: This cross-sectional study included 937 Han Chinese patients (248 with hip fractures). Minimal model variables were calculated from the hip structural analysis, including bone mineral density (BMD), femoral neck width (FNW), and Delta and Sigma values. Results: The study included 937 patients (293 men; mean age = 68.3 years). In logistic regression analyses, each 0.1 g/cm2 increase in BMD was associated with a 45% reduction in hip fracture risk (odds ratio [OR] = 0.55; 95% confidence interval [CI]: 0.45–0.68) after adjusting for all covariates. In contrast, FNW (per 0.1 cm), Sigma (per 0.01 cm), and Delta (per 0.01 cm) were associated with increased risk (OR = 1.28; 95% CI: 1.18–1.37; OR = 1.06; 95% CI: 1.03–1.09; and OR = 1.06; 95% CI: 1.03–1.09, respectively). When Delta exceeded 0.17 cm, hip fracture risk rose by 13% (OR = 1.13; 95% CI: 1.08–1.18) for every additional 0.01 cm increase in Delta. The area under the curve (AUC) for hip fracture prediction from BMD alone was significantly lower than that of the minimal model (0.781 vs 0.838, p
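To make the reported effect sizes concrete, a hedged sketch of how a logistic-regression coefficient maps to the odds ratios above; the coefficient and standard error are illustrative, not the study's fitted values:

```python
import numpy as np

# An OR per increment is exp(beta), where beta is the log-odds change per
# one increment of the predictor (here, 0.1 g/cm2 of BMD).
beta_bmd = np.log(0.55)            # illustrative: reproduces the reported OR
print(f"OR per 0.1 g/cm2 BMD: {np.exp(beta_bmd):.2f}")  # 0.55

# A 95% CI follows from the coefficient's standard error (illustrative value):
se = 0.11
ci = np.exp(beta_bmd + np.array([-1.96, 1.96]) * se)
print(f"95% CI: {ci[0]:.2f}-{ci[1]:.2f}")  # roughly 0.44-0.68
```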
CC0 1.0 Universal Public Domain Dedicationhttps://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
This dataset contains 3 sets of 50 realizations of hydrostratigraphy in Egebjerg, Denmark, following the GDM method (https://doi.org/10.1016/j.enggeo.2022.106833). The three sets vary in the smoothing factor (sigma) of the low-frequency model. The dataset is used to study the importance of the uncertainty level in geological interpretation modelling in the following paper ("Incorporating interpretation uncertainties from deterministic 3D hydrostratigraphic models in groundwater models", https://doi.org/10.5194/hess-2023-74). The three scenarios are parameterized as follows: 1) a low-uncertainty scenario with sigma = 2 and the uncertainties from the GDM paper divided by three; 2) a medium-uncertainty scenario with sigma = 7 and the uncertainties corresponding to the values from the GDM paper; 3) a high-uncertainty scenario with sigma = 12 and the uncertainties from the GDM paper multiplied by three.
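As an illustration of what the smoothing factor controls, a sketch using a generic Gaussian filter; this stands in for the GDM low-frequency model rather than reproducing it, and the input field is synthetic:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Larger sigma produces a smoother low-frequency field; `field` is a
# synthetic stand-in for interpreted hydrostratigraphic surfaces.
rng = np.random.default_rng(0)
field = rng.normal(size=(200, 200))

low_uncertainty = gaussian_filter(field, sigma=2)     # scenario 1
medium_uncertainty = gaussian_filter(field, sigma=7)  # scenario 2
high_uncertainty = gaussian_filter(field, sigma=12)   # scenario 3
```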
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Coupled Model Intercomparison Project Phase 6 (CMIP6) datasets. These data include all datasets published for 'CMIP6.CMIP.INM.INM-CM4-8.1pctCO2' with the full Data Reference Syntax following the template 'mip_era.activity_id.institution_id.source_id.experiment_id.member_id.table_id.variable_id.grid_label.version'.
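A small sketch of splitting such an identifier into its named DRS components, per the template quoted above; the example identifier is hypothetical:

```python
# Field names taken from the DRS template in the description.
DRS_FIELDS = ("mip_era", "activity_id", "institution_id", "source_id",
              "experiment_id", "member_id", "table_id", "variable_id",
              "grid_label", "version")

def parse_drs(drs: str) -> dict:
    parts = drs.split(".")
    if len(parts) != len(DRS_FIELDS):
        raise ValueError(f"expected {len(DRS_FIELDS)} dot-separated fields")
    return dict(zip(DRS_FIELDS, parts))

# Hypothetical identifier following the template:
print(parse_drs("CMIP6.CMIP.INM.INM-CM4-8.1pctCO2.r1i1p1f1.Amon.tas.gr1.v20190530"))
```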
The INM-CM4-8 climate model, released in 2016, includes the following components: aerosol: INM-AER1, atmos: INM-AM4-8 (2x1.5; 180 x 120 longitude/latitude; 21 levels; top level sigma = 0.01), land: INM-LND1, ocean: INM-OM5 (North Pole shifted to 60N, 90E; 360 x 318 longitude/latitude; 40 levels; sigma vertical coordinate), seaIce: INM-ICE1. The model was run by the Institute for Numerical Mathematics, Russian Academy of Science, Moscow 119991, Russia (INM) in native nominal resolutions: aerosol: 100 km, atmos: 100 km, land: 100 km, ocean: 100 km, seaIce: 100 km.
Project: These data have been generated as part of the internationally coordinated Coupled Model Intercomparison Project Phase 6 (CMIP6; see also the GMD Special Issue: http://www.geosci-model-dev.net/special_issue590.html). The simulation data provide a basis for climate research designed to answer fundamental science questions and serve as a resource for authors of the Sixth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC-AR6).
CMIP6 is a project coordinated by the Working Group on Coupled Modelling (WGCM) as part of the World Climate Research Programme (WCRP). Phase 6 builds on previous phases executed under the leadership of the Program for Climate Model Diagnosis and Intercomparison (PCMDI) and relies on the Earth System Grid Federation (ESGF) and the Centre for Environmental Data Analysis (CEDA) along with numerous related activities for implementation. The original data is hosted and partially replicated on a federated collection of data nodes, and most of the data relied on by the IPCC is being archived for long-term preservation at the IPCC Data Distribution Centre (IPCC DDC) hosted by the German Climate Computing Center (DKRZ).
The project includes simulations from about 120 global climate models and around 45 institutions and organizations worldwide. - Project website: https://pcmdi.llnl.gov/CMIP6.
Sediment accretion and subduction at convergent margins play an important role in the nature of hazardous interplate seismicity (the seismogenic zone) and the subduction recycling of volatiles and continentally derived materials to the Earth's mantle. Identifying and quantifying sediment accretion, essential for a complete mass balance across the margin, can be difficult. Seismic images do not define the processes by which a prism was built, and cored sediments may show disturbed magnetostratigraphy and sparse biostratigraphy. This contribution reports the first use of cosmogenic 10Be depth profiles to define the origin and structural evolution of forearc sedimentary prisms. Biostratigraphy and 10Be model ages generally are in good agreement for sediments drilled at Deep Sea Drilling Project Site 434 in the Japan forearc, and support an origin by imbricate thrusting for the upper section. Forearc sediments from Ocean Drilling Program Site 1040 in Costa Rica lack good fossil or paleomagnetic age control above the decollement. Low and homogeneous 10Be concentrations show that the prism sediments are older than 3-4 Ma, and that the prism is either a paleoaccretionary prism or it formed largely from slump deposits of apron sediments. Low 10Be in Costa Rican lavas and the absence of frontal accretion imply deeper sediment underplating or subduction erosion.
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Data archived here are the external iron input data and model output data discussed in a paper entitled "Slowly sinking particles underlie dissolved iron transport across the Pacific Ocean" submitted to Global Biogeochemical Cycles. The model used in this study was developed by coupling the Regional Ocean Modeling System (Shchepetkin and McWilliams, 2005) and the Biogeochemical Elemental Cycling model (Moore et al., 2013). The model covers the whole North Pacific Ocean on a 1/4° horizontal mesh. The external iron input data are iron fluxes due to atmospheric deposition and dissolution from seabed sediments. The model output data are simulated dissolved iron concentrations, presented only for the intermediate layer (26.6-27.4 sigma-theta, in 0.02 sigma-theta bins). The simulated data were regridded to a 1° mesh to reduce the size of the data. The model was run for 100 years, and the simulated dissolved iron concentrations are in a quasi-steady state. For more details about the individual archived data, please refer to the README.pdf included in the data. References: Shchepetkin, A. F., & McWilliams, J. C. (2005). The regional oceanic modeling system (ROMS): A split-explicit, free-surface, topography-following-coordinate oceanic model. Ocean Modelling, 9(4), 347-404. Moore, J. K., Lindsay, K., Doney, S. C., Long, M. C., & Misumi, K. (2013). Marine ecosystem dynamics and biogeochemical cycling in the Community Earth System Model (CESM1-BGC). Journal of Climate, 26, 9291-9312.
https://www.kappasignal.com/p/legal-disclaimer.html
This analysis presents a rigorous exploration of financial data, incorporating a diverse range of statistical features. By providing a robust foundation, it facilitates advanced research and innovative modeling techniques within the field of finance.
Historical daily stock prices (open, high, low, close, volume)
Fundamental data (e.g., market capitalization, price-to-earnings (P/E) ratio, dividend yield, earnings per share (EPS), price-to-earnings growth, debt-to-equity ratio, price-to-book ratio, current ratio, free cash flow, projected earnings growth, return on equity, dividend payout ratio, price-to-sales ratio, credit rating)
Technical indicators (e.g., moving averages, RSI, MACD, average directional index, Aroon oscillator, stochastic oscillator, on-balance volume, accumulation/distribution (A/D) line, parabolic SAR, Bollinger Bands, Fibonacci retracement levels, Williams %R, commodity channel index); a sample indicator computation appears after the notes below
Feature engineering based on financial data and technical indicators
Sentiment analysis data from social media and news articles
Macroeconomic data (e.g., GDP, unemployment rate, interest rates, consumer spending, building permits, consumer confidence, inflation, producer price index, money supply, home sales, retail sales, bond yields)
Stock price prediction
Portfolio optimization
Algorithmic trading
Market sentiment analysis
Risk management
Researchers investigating the effectiveness of machine learning in stock market prediction
Analysts developing quantitative buy/sell trading strategies
Individuals interested in building their own stock market prediction models
Students learning about machine learning and financial applications
The dataset may include different levels of granularity (e.g., daily, hourly)
Data cleaning and preprocessing are essential before model training
Regular updates are recommended to maintain the accuracy and relevance of the data
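As referenced in the technical indicators list above, a minimal sketch computing two of the listed indicators (simple moving average and RSI) with pandas on a hypothetical price series:

```python
import pandas as pd

def sma(close: pd.Series, window: int = 20) -> pd.Series:
    """Simple moving average over a trailing window."""
    return close.rolling(window).mean()

def rsi(close: pd.Series, window: int = 14) -> pd.Series:
    """RSI with simple (non-Wilder) averaging of gains and losses."""
    delta = close.diff()
    gain = delta.clip(lower=0).rolling(window).mean()
    loss = (-delta.clip(upper=0)).rolling(window).mean()
    rs = gain / loss
    return 100 - 100 / (1 + rs)

# Hypothetical daily closes, not taken from the dataset:
close = pd.Series([100, 101, 99, 102, 103, 101, 104, 106, 105, 107,
                   108, 107, 109, 111, 110, 112], dtype=float)
print(sma(close, 5).iloc[-1], rsi(close, 14).iloc[-1])
```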
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
`*.npz` and `*.asdf` files containing visibilities are in the TMS format (opposite to that of CASA).
logo_cube.noise.npz visibilities have been rescaled such that (data - model) / sigma follows the expected Gaussian envelope.
HD 143006 continuum visibilities have flagged outliers removed and their weights rescaled such that (data - model) / sigma follows the expected Gaussian envelope for each spectral window.
AS 209 continuum visibilities have been averaged across frequency and their weights rescaled such that (data - model) / sigma follows the expected Gaussian envelope for each spectral window.
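A sketch of the weight-rescaling convention described above, assuming the usual radio-interferometric relation sigma = 1/sqrt(weight); the arrays are synthetic stand-ins, not the archived visibilities:

```python
import numpy as np

def rescale_weights(data: np.ndarray, model: np.ndarray,
                    weights: np.ndarray) -> np.ndarray:
    """Rescale weights so (data - model)/sigma has unit standard deviation."""
    sigma = 1.0 / np.sqrt(weights)
    s = np.std((data - model) / sigma)  # ~1 if weights are already correct
    return weights / s**2               # equivalent to sigma_new = sigma * s

rng = np.random.default_rng(1)
model = np.zeros(1000)
data = rng.normal(model, 2.0)           # true scatter of 2
w = np.full(1000, 1.0)                  # mis-scaled weights (imply sigma = 1)
w_new = rescale_weights(data, model, w)
print(np.std((data - model) * np.sqrt(w_new)))  # ~1, the Gaussian envelope
```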
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
These data include all datasets published for 'CMIP6.CMIP.INM.INM-CM4-8.historical' with the full Data Reference Syntax following the template 'mip_era.activity_id.institution_id.source_id.experiment_id.member_id.table_id.variable_id.grid_label.version'. The INM-CM4-8 climate model, released in 2016, includes the following components: aerosol: INM-AER1, atmos: INM-AM4-8 (2x1.5; 180 x 120 longitude/latitude; 21 levels; top level sigma = 0.01), land: INM-LND1, ocean: INM-OM5 (North Pole shifted to 60N, 90E; 360 x 318 longitude/latitude; 40 levels; sigma vertical coordinate), seaIce: INM-ICE1. The model was run by the Institute for Numerical Mathematics, Russian Academy of Science, Moscow 119991, Russia (INM) in native nominal resolutions: aerosol: 100 km, atmos: 100 km, land: 100 km, ocean: 100 km, seaIce: 100 km.
Individuals using the data must abide by terms of use for CMIP6 data (https://pcmdi.llnl.gov/CMIP6/TermsOfUse). The original license restrictions on these datasets were recorded as global attributes in the data files, but these may have been subsequently updated.
The $p_{\rm T}$-differential production cross sections of prompt D$^{0}$, $\Lambda_{\rm c}^{+}$, and $\Sigma_{\rm c}^{0,++}(2455)$ charmed hadrons are measured at midrapidity ($|y| < 0.5$) in pp collisions at $\sqrt{s} = 13$ TeV. This is the first measurement of $\Sigma_{\rm c}^{0,++}$ production in hadronic collisions. Assuming the same production yield for the three $\Sigma_{\rm c}^{0,+,++}$ isospin states, the baryon-to-meson cross section ratios $\Sigma_{\rm c}^{0,+,++}/{\rm D}^{0}$ and $\Lambda_{\rm c}^{+}/{\rm D}^{0}$ are calculated in the transverse momentum ($p_{\rm T}$) intervals $2 < p_{\rm T} < 12$ GeV/$c$ and $1 < p_{\rm T} < 24$ GeV/$c$. Values significantly larger than in e$^{+}$e$^{-}$ collisions are observed, indicating for the first time that baryon enhancement in hadronic collisions also extends to the $\Sigma_{\rm c}$. The feed-down contribution to $\Lambda_{\rm c}^{+}$ production from $\Sigma_{\rm c}^{0,+,++}$ is also reported and is found to be larger than in e$^{+}$e$^{-}$ collisions. The data are compared with predictions from event generators and other phenomenological models, providing a sensitive test of the different charm-hadronisation mechanisms implemented in the models.
Radionuclide concentrations were studied in sediment cores taken at the continental slope of the Philippine Sea off Mindanao Island in the equatorial western Pacific. High-resolution deposition records of anthropogenic radionuclides were collected at this site. Excess 210Pb, together with excess 228Th and anthropogenic radionuclides, provided information about accumulation rates. Concentrations of Am and Pu isotopes were determined by gamma spectrometry, alpha spectrometry, and ICP-MS. The Pu isotope ratios indicate a high proportion (at least 60%) of Pu from the Pacific Proving Grounds (PPG), implying that the southward transport of PPG-derived plutonium with the Mindanao Current is as effective as the previously known northward transport with the Kuroshio Current. The record is compared with other studies from northwest Pacific marginal seas and the Lombok Basin in the Indonesian Archipelago. The top of the sediment core contains a 6 cm thick layer dominated by terrestrial organic matter, interpreted as the result of fast deposition related to Typhoon Pablo in 2012.
Attribution 4.0 (CC BY 4.0)https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
These data are used to develop a pipeline based on several models to predict the sigma-phase fraction and impact strength of super duplex stainless (SDS) steel. The data correspond to the article "Prediction of the critical cooling rate in the dependence of the chemical composition of super duplex steel X03Cr23Ni6Mo4Cu3NbN".
After many years of research and technical preparation, the production of a new ECMWF climate reanalysis to replace ERA-Interim is in progress. ERA5 is the fifth generation of ECMWF atmospheric reanalyses of the global climate, which started with the FGGE reanalyses produced in the 1980s, followed by ERA-15, ERA-40, and most recently ERA-Interim. ERA5 will cover the period January 1950 to near real time. ERA5 is produced using high-resolution forecasts (HRES) at 31 kilometer resolution (one fourth the spatial resolution of the operational model) and a 62 kilometer resolution, ten-member 4D-Var ensemble of data assimilation (EDA) in cycle CY41r2 of ECMWF's Integrated Forecast System (IFS), with 137 hybrid sigma-pressure (model) levels in the vertical, up to a top level of 0.01 hPa. Atmospheric data on these levels are interpolated to 37 pressure levels (the same levels as in ERA-Interim). Surface or single-level data are also available, containing 2D parameters such as precipitation, 2 meter temperature, top-of-atmosphere radiation, and vertical integrals over the entire atmosphere. The IFS is coupled to a soil model, the parameters of which are also designated as surface parameters, and an ocean wave model. Generally, the data are available at an hourly frequency and consist of analyses and short (12 hour) forecasts, initialized twice daily from analyses at 06 and 18 UTC. Most analysis parameters are also available from the forecasts. There are a number of forecast parameters, for example mean rates and accumulations, that are not available from the analyses. Improvements in ERA5, compared to ERA-Interim, include the use of HadISST.2, reprocessed ECMWF climate data records (CDR), and implementation of RTTOV11 radiative transfer. Variational bias corrections have been applied not only to satellite radiances but also to ozone retrievals, aircraft observations, surface pressure, and radiosonde profiles. Please note: DECS is producing a CF 1.6 compliant netCDF-4/HDF5 version of ERA5...
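For orientation, a sketch of how pressure on hybrid sigma-pressure (model) levels is reconstructed from the standard relation p(k) = a(k) + b(k) * p_s, where a dominates aloft and b near the surface; the coefficient values below are illustrative, not the IFS L137 tables:

```python
import numpy as np

# Illustrative a/b coefficients for a 5-level toy hybrid coordinate.
a = np.array([0.0, 2000.0, 6000.0, 1000.0, 0.0])  # Pa
b = np.array([0.0, 0.0, 0.24, 0.82, 1.0])         # dimensionless
surface_pressure = 101325.0                        # Pa

pressure = a + b * surface_pressure
print(pressure)  # increases monotonically toward the surface
```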
CC0 1.0 Universal Public Domain Dedicationhttps://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
NOAA's National Geodetic Survey (NGS) is in the process of modernizing the National Spatial Reference System (NSRS). As part of NSRS Modernization, the North American Vertical Datum of 1988 (NAVD88) will be replaced by the North American-Pacific Geopotential Datum of 2022 (NAPGD2022). This layer is the estimated uncertainty for the SGEOID2022 model, provided in meters at the one-sigma level. It is intended for NGS customers, stakeholders, partners, and other constituents to view and analyze the alpha models. These models are preliminary and should not be considered final, as they can change.
Data Sources: Information about NAPGD2022 can be found on the NAPGD2022 Alpha web pages (https://alpha.ngs.noaa.gov/NAPGD2022/). Preliminary models can be downloaded from the NAPGD2022 Alpha download page (https://alpha.ngs.noaa.gov/NAPGD2022/download.shtml).
Model Information: This model is provided as a global grid of estimates originating from the EGM2008 model uncertainties for the geoid (Pavlis et al. 2012). The estimated uncertainties have been updated throughout the North American-Pacific model to reflect surface gravity data coverage, accuracy, and terrain.
Point of Contact: Please email the NGS Information Center with any questions at ngs.infocenter@noaa.gov