100+ datasets found
  1. Proton-Proton-Collision-process-at-LHC-simulations

    • kaggle.com
    Updated Sep 15, 2022
    Cite
    (2022). Proton-Proton-Collision-process-at-LHC-simulations [Dataset]. https://www.kaggle.com/datasets/shirshmall/lhc-events-ppee-ppmumu
    Available download formats: Croissant (a format for machine-learning datasets; see mlcommons.org/croissant)
    Description

    Proton-proton collision event generation is a key step in simulating high-energy physics experiments at particle colliders, such as the Large Hadron Collider (LHC) at CERN. Here is a brief overview of the process and of the software tools involved:

    MadGraph: This software package simulates high-energy particle collisions and generates event samples. It uses Feynman diagrams to calculate the cross sections for various collision processes and can generate events with user-specified cuts and kinematic constraints. MadGraph also has an interface to Pythia for showering and hadronization of the generated events.

    Pythia: This is a particle physics event generator designed for the simulation of high-energy collisions. It simulates the parton showering and hadronization processes that occur after the hard scatter, using various models and parameters. Pythia also includes a detailed simulation of the underlying event, which accounts for the multiple parton interactions that occur in a proton-proton collision.

    HEPMC2: This is a file format for storing Monte Carlo event samples in high-energy physics. It is used as an input/output format by many event generators and analysis tools, including Pythia and Delphes. HEPMC2 files contain information on the particles produced in a collision event, their kinematics, and the interactions that led to their production.
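
    To make the file contents concrete, here is a minimal sketch that iterates over a HepMC2 file with the pyhepmc package and prints the transverse momenta of final-state muons. The file name is a placeholder, and the HepMC3-style pyhepmc API (which also reads HepMC2 ASCII files) is an assumption to verify against your installed version.

    ```python
    # Minimal sketch: inspect a HepMC2 event file with pyhepmc (pip install pyhepmc).
    # "events.hepmc" is a placeholder file name.
    import pyhepmc

    with pyhepmc.open("events.hepmc") as events:
        for i, event in enumerate(events):
            # Final-state particles carry HepMC status code 1; PDG ID 13 is the muon.
            muons = [p for p in event.particles
                     if p.status == 1 and abs(p.pid) == 13]
            for mu in muons:
                print(f"event {event.event_number}: muon pT = {mu.momentum.pt():.1f} GeV")
            if i >= 4:  # look at the first five events only
                break
    ```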

    Delphes: This is a software framework for simulating the response of a particle detector to high-energy collisions. It takes event samples generated by Pythia or other event generators and simulates the particle interactions in a detector, including effects such as energy deposition, tracking, and calorimetry. Delphes produces output files in a ROOT format that can be analyzed using ROOT.

    ROOT: This is a data analysis framework widely used in high-energy physics. It includes tools for manipulating and analyzing large datasets, including the simulation and reconstruction output files produced by event generators and detector simulation tools. ROOT also includes a powerful visualization tool for generating 2D and 3D plots of particle collisions and detector interactions.
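
    For instance, a Delphes output file can be read without a full ROOT installation using the uproot package. This is a hedged sketch: the file name is a placeholder, and the tree name "Delphes" with branches such as "Muon.PT" follows the usual Delphes conventions, which should be confirmed with tree.keys() on a real file.

    ```python
    # Minimal sketch: read muon kinematics from a Delphes ROOT file with uproot.
    import uproot

    with uproot.open("delphes_output.root") as f:
        tree = f["Delphes"]  # standard name of the Delphes event tree
        arrays = tree.arrays(["Muon.PT", "Muon.Eta"], library="ak")
        print("muon pT values in the first event:", arrays["Muon.PT"][0].tolist())
    ```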

    The proton-proton collision event generation process involves using MadGraph to generate hard scattering events, which are then passed to Pythia for parton showering and hadronization. The resulting events are stored in HEPMC2 format and then simulated in a detector using Delphes. The final output is a ROOT file that can be analyzed using various analysis tools within the ROOT framework.
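
    Below is an illustrative sketch of driving this chain from Python. The executable paths, card names, and the pp > mu+ mu- process are assumptions chosen to match this dataset's title, not a recipe taken from the dataset itself; adapt them to a local installation.

    ```python
    # Illustrative sketch of the MadGraph -> Pythia -> Delphes chain.
    # All paths and file names below are placeholders.
    import pathlib
    import subprocess

    # MadGraph generates the hard-scattering events; with its Pythia8 interface
    # enabled, showering and hadronization produce a HepMC event file.
    proc_card = pathlib.Path("proc_ppmumu.dat")
    proc_card.write_text(
        "generate p p > mu+ mu-\n"  # hard-scattering process
        "output ppmumu\n"           # create the process directory
        "launch\n"                  # run the event generation
    )
    subprocess.run(["./MG5_aMC/bin/mg5_aMC", str(proc_card)], check=True)

    # Delphes reads the HepMC events and simulates the detector response,
    # writing a ROOT file for analysis.
    subprocess.run(
        ["./Delphes/DelphesHepMC2", "cards/delphes_card_CMS.tcl",
         "delphes_output.root", "ppmumu/Events/run_01/events.hepmc"],
        check=True,
    )
    ```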

    Here is a high-level overview of the steps involved in generating event data for high-energy physics experiments using Monte Carlo simulation:

    1. Specify the collision process: The first step is to specify the collision process that you want to simulate. This typically involves specifying the particles that will collide (e.g., protons, electrons, or other particles), the energy of the collision, and any relevant initial or final states.

    2. Generate hard scattering events: Once the collision process is specified, you can use a software package like MadGraph to generate hard scattering events. MadGraph uses Feynman diagrams to calculate the cross sections for various collision processes and generates events based on user-specified cuts and kinematic constraints.

    3. Apply parton showering and hadronization: After generating the hard scattering events, you can use a software package like Pythia to simulate the parton showering and hadronization processes that occur after the hard scatter. This involves simulating the fragmentation of the partons produced in the hard scattering event into hadrons and the subsequent showering of additional partons produced in the hadronization process.

    4. Simulate the detector response: Once you have generated a set of simulated events, you can use a software package like Delphes to simulate the response of the particle detector to the collisions. This involves simulating the interactions of particles with the detector material and the detector's response to the energy deposited by the particles.

    5. Analyze the data: Once you have generated and simulated the events, you can analyze the resulting data to extract information about the properties of the particles produced in the collision. This typically involves applying various cuts and selection criteria to the data and using statistical techniques to estimate the background and systematic uncertainties in the analysis (a minimal selection sketch follows this list).

    6. Compare with experimental data: Finally, you can compare the results of your Monte Carlo simulation with experimental data to test the accuracy of the simulation and to gain insights into the underlying physics of the collision process.
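
    To make step 5 concrete, here is a small self-contained sketch that applies kinematic cuts and computes a dimuon invariant mass. The toy input arrays are invented stand-ins for whatever the simulation chain produces.

    ```python
    # Minimal sketch of an event selection: cuts plus a dimuon invariant mass.
    import numpy as np

    # Toy kinematics for two muons per event: pT (GeV), eta, phi.
    rng = np.random.default_rng(0)
    n = 1000
    pt1, pt2 = rng.uniform(20, 80, n), rng.uniform(20, 80, n)
    eta1, eta2 = rng.uniform(-2.4, 2.4, n), rng.uniform(-2.4, 2.4, n)
    phi1, phi2 = rng.uniform(-np.pi, np.pi, n), rng.uniform(-np.pi, np.pi, n)

    # Selection: both muons above threshold and inside the acceptance.
    sel = (pt1 > 25) & (pt2 > 25) & (np.abs(eta1) < 2.4) & (np.abs(eta2) < 2.4)

    # Massless approximation: m^2 = 2 pT1 pT2 (cosh(d_eta) - cos(d_phi)).
    mass = np.sqrt(2 * pt1 * pt2 * (np.cosh(eta1 - eta2) - np.cos(phi1 - phi2)))
    print(f"{sel.sum()}/{n} events pass; mean selected mass {mass[sel].mean():.1f} GeV")
    ```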

  2. PDF projections for the HL-LHC and the LHeC

    • zenodo.org
    application/gzip
    Updated Jan 24, 2020
    Cite
    Rabah Abdul Khalek; Shaun Bailey; Jun Gao; Lucian Harland-Lang; Juan Rojo (2020). PDF projections for the HL-LHC and the LHeC [Dataset]. http://doi.org/10.5281/zenodo.3250580
    Available download formats: application/gzip
    Dataset provided by: Zenodo (http://zenodo.org/)
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    For the foreseeable future, the exploration of the high-energy frontier will be the domain of the Large Hadron Collider (LHC). Of particular significance will be its high-luminosity upgrade (HL-LHC), which will operate until the mid-2030s. In this endeavour, an improved understanding of the parton distribution functions (PDFs) of the proton is critical for the full exploitation of the HL-LHC physics potential.

    Since its start of data taking, the LHC has provided an impressive wealth of information on the quark and gluon structure of the proton. Indeed, modern global analyses of parton distribution functions (PDFs) include a wide range of LHC measurements of processes such as the production of jets, electroweak gauge bosons, and top-quark pairs. Here we provide quantitative projections for global PDF fits that include the information expected from the High-Luminosity LHC (HL-LHC), in the LHAPDF format. These projections have already been used, in particular, in the HL-LHC CERN Yellow Reports.

    The HL-LHC program would be uniquely complemented by the proposed Large Hadron electron Collider (LHeC), a high-energy lepton-proton and lepton-nucleus collider based at CERN. Here we also present PDF projections based on the expected LHeC measurements of inclusive and heavy quark structure functions. These projections are presented both for the LHeC individually, and also in connection with the HL-LHC pseudo-data.

    The LHAPDF sets made available in this repository are the following (a usage sketch follows the list):

    1. PDF4LHC15_nnlo_hllhc_scen1_lhec.tgz : Based on the PDF4LHC15 global fit supplemented by HL-LHC pseudo-data in Scenario 1 and also by the LHeC pseudo-data.
    2. PDF4LHC15_nnlo_hllhc_scen2_lhec.tgz : Based on the PDF4LHC15 global fit supplemented by HL-LHC pseudo-data in Scenario 2 and also by the LHeC pseudo-data.
    3. PDF4LHC15_nnlo_hllhc_scen3_lhec.tgz : Based on the PDF4LHC15 global fit supplemented by HL-LHC pseudo-data in Scenario 3 and also by the LHeC pseudo-data.
    4. PDF4LHC15_nnlo_lhec.tgz : Based on the PDF4LHC15 global fit supplemented by the LHeC pseudo-data.
    5. PDF4LHC15_nnlo_hllhc_scen1.tgz : Based on the PDF4LHC15 global fit supplemented by HL-LHC pseudo-data in Scenario 1.
    6. PDF4LHC15_nnlo_hllhc_scen2.tgz : Based on the PDF4LHC15 global fit supplemented by HL-LHC pseudo-data in Scenario 2.
    7. PDF4LHC15_nnlo_hllhc_scen3.tgz : Based on the PDF4LHC15 global fit supplemented by HL-LHC pseudo-data in Scenario 3.
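
    A hedged usage sketch with the LHAPDF Python bindings follows. It assumes one of the tarballs has been unpacked into the LHAPDF data path and that the set name matches the tarball name; verify both locally.

    ```python
    # Minimal sketch: evaluate a projection set through LHAPDF's Python API.
    import lhapdf

    pdf = lhapdf.mkPDF("PDF4LHC15_nnlo_hllhc_scen1", 0)  # member 0 = central
    x, Q = 0.01, 100.0                # momentum fraction and scale Q in GeV
    print("gluon xf(x, Q) =", pdf.xfxQ(21, x, Q))        # PDG ID 21 = gluon
    ```
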
  3. Data from: The Large Hadron Collider : unraveling the mysteries of the universe

    • workwithdata.com
    Updated Apr 13, 2024
    Cite
    Work With Data (2024). The Large Hadron Collider : unraveling the mysteries of the universe [Dataset]. https://www.workwithdata.com/object/the-large-hadron-collider-unraveling-mysteries-universe-book-by-martin-beech-1959
    Dataset authored and provided by
    Work With Data
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Explore The Large Hadron Collider: unraveling the mysteries of the universe through unique data from multiple sources: key facts, real-time news, interactive charts, detailed maps, and open datasets.

  4. New Physics Mining at the Large Hadron Collider: Z -> l l

    • zenodo.org
    application/gzip
    Updated Feb 19, 2020
    Cite
    Olmo Cerri; Thong Nguyen; Maurizio Pierini; Jean-Roch Vlimant (2020). New Physics Mining at the Large Hadron Collider: Z -> l l [Dataset]. http://doi.org/10.5281/zenodo.3675203
    Available download formats: application/gzip
    Dataset provided by: Zenodo (http://zenodo.org/)
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    \(Z \to \ell \ell\) background events reconstructed by inclusive single-muon selection.

    Events are represented as an array of physics-motivated high-level features.

    Details are given in https://arxiv.org/abs/1811.10276
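
    A hedged loading sketch follows, assuming the unpacked archive contains an HDF5 file of per-event feature vectors; the file and key names are hypothetical, so inspect the real file's keys first.

    ```python
    # Minimal sketch: look inside an HDF5 feature file with h5py.
    # "Zll_features.h5" and the key "HLF" are hypothetical names.
    import h5py

    with h5py.File("Zll_features.h5", "r") as f:
        print("top-level keys:", list(f.keys()))
        # features = f["HLF"][:]  # e.g. an (n_events, n_features) array
    ```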

  5. Data from: Measurement of very forward neutron energy spectra for 7 TeV proton--proton collisions at the Large Hadron Collider

    • hepdata.net
    Updated 2016
    Cite
    HEPData (2016). Measurement of very forward neutron energy spectra for 7 TeV proton--proton collisions at the Large Hadron Collider [Dataset]. http://doi.org/10.17182/hepdata.73320
    Dataset provided by: HEPData
    Description

    CERN-LHC. The Large Hadron Collider forward (LHCf) experiment is designed to use the LHC to verify the hadronic-interaction models used in cosmic-ray physics. Forward baryon production is one of the crucial points to understand the development of cosmic-ray showers. We report the neutron-energy spectra for LHC $\sqrt{s}$ = 7 TeV proton-proton collisions with the pseudo-rapidity $\eta$ ranging from 8.81 to 8.99, from 8.99 to 9.22, and from 10.76 to infinity.

  6. HL-LHC Wakefield Simulations: 7TeV STABLE conditions

    • figshare.com
    application/gzip
    Updated May 31, 2023
    + more versions
    Cite
    Adrian Oeftiger (2023). HL-LHC Wakefield Simulations: 7TeV STABLE conditions [Dataset]. http://doi.org/10.6084/m9.figshare.7429400.v3
    Available download formats: application/gzip
    Dataset provided by: figshare
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    PyHEADTAIL macro-particle simulations (http://github.com/PyCOMPLETE/PyHEADTAIL) of single-bunch stability in the High Luminosity LHC at top energy produced the present data. Head-tail instabilities driven by the machine impedance can seriously degrade the beam quality and even lead to machine-protection issues.

    Here we simulate with smooth-approximation optics and recent versions of the complete HL-LHC impedance model at flat-top conditions at an energy of 7 TeV. The influence of the different wake terms (dipolar; dipolar + cross-terms; dipolar + cross-terms + quadrupolar), the longitudinal distribution (thermal and q = 3/5 q-Gaussian), and the different impedance scenarios (without MoGr-coated TCTs yet) are compared. For each case, we scan the linear chromaticity. The transverse feedback is taken into account with an ideal damper model at 50 turns. Longitudinal non-linear synchrotron motion is employed. Lattice non-linearities (such as detuning with amplitude from octupoles) are not considered. Here we consider STABLE conditions, i.e. the impedance model for the fully squeezed beta* = 15 cm at IP1 and IP5.

  7. LHC collider phenomenology of minimal universal extra dimensions - Dataset - B2FIND

    • b2find.dkrz.de
    Updated Feb 3, 2017
    + more versions
    Cite
    (2017). LHC collider phenomenology of minimal universal extra dimensions - Dataset - B2FIND [Dataset]. https://b2find.dkrz.de/dataset/4e945a31-dfcc-554f-8089-b14934ca1a01
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    We discuss the collider phenomenology of the model of Minimal Universal Extra Dimensions (MUED) at the Large Hadron Collider (LHC). We derive analytical results for all relevant strong pair-production processes of two level-1 Kaluza–Klein partners and use them to validate and correct the existing MUED implementation in the Fortran version of the Pythia event generator. We also develop a new implementation of the model in the C++ version of Pythia. We use our implementations in conjunction with the CheckMATE package to derive the LHC bounds on MUED from a large number of published experimental analyses from Run 1 at the LHC. The previous version of this program (ACTU_v2_1) may be found at http://dx.doi.org/10.1016/j.cpc.2009.08.008.

  8. New Physics Mining at the Large Hadron Collider: W -> l nu

    • zenodo.org
    • explore.openaire.eu
    application/gzip
    Updated Feb 19, 2020
    Cite
    Olmo Cerri; Thong Nguyen; Maurizio Pierini; Jean-Roch Vlimant (2020). New Physics Mining at the Large Hadron Collider: W -> l nu [Dataset]. http://doi.org/10.5281/zenodo.3675199
    Available download formats: application/gzip
    Dataset provided by: Zenodo (http://zenodo.org/)
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    \(W \to \ell\nu\) background events reconstructed by inclusive single-muon selection.

    Events are represented as an array of physics-motivated high-level features.

    Details are given in https://arxiv.org/abs/1811.10276

  9. Data from: LHC collider phenomenology of minimal universal extra dimensions

    • data.mendeley.com
    Updated Feb 28, 2018
    + more versions
    Cite
    Jyotiranjan Beuria (2018). LHC collider phenomenology of minimal universal extra dimensions [Dataset]. http://doi.org/10.17632/fkvcgdw9j6.1
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    We discuss the collider phenomenology of the model of Minimal Universal Extra Dimensions (MUED) at the Large Hadron Collider (LHC). We derive analytical results for all relevant strong pair-production processes of two level-1 Kaluza–Klein partners and use them to validate and correct the existing MUED implementation in the Fortran version of the Pythia event generator. We also develop a new implementation of the model in the C++ version of Pythia. We use our implementations in conjunction with the CheckMATE package to derive the LHC bounds on MUED from a large number of published experimental analyses from Run 1 at the LHC.

    The previous version of this program (ACTU_v2_1) may be found at http://dx.doi.org/10.1016/j.cpc.2009.08.008.

  10. Search for Higgs Boson Pair Production in the γγbb̄ Final State using pp Collision Data at √s = 8 TeV from the ATLAS Detector

    • osti.gov
    Updated Jun 1, 2014
    + more versions
    Cite
    Aad, Georges; The ATLAS collaboration (2014). Search for Higgs Boson Pair Production in the γγbb̄ Final State using pp Collision Data at √s = 8 TeV from the ATLAS Detector [Dataset]. http://doi.org/10.17182/hepdata.64171
    Dataset provided by: United States Department of Energy (http://energy.gov/), Office of Science (http://www.er.doe.gov/)
    Description

    CERN-LHC. Searches are performed for resonant and non-resonant Higgs boson pair production in the $hh \to \gamma\gamma b\bar{b}$ final state using 20 fb$^{-1}$ of proton--proton collisions at a center-of-mass energy of 8 TeV recorded with the ATLAS detector at the CERN Large Hadron Collider. A 95% confidence level upper limit on the cross section times branching ratio of non-resonant production is set at 2.2 pb, while the expected limit is 1.0 pb. The corresponding limit observed for a narrow resonance ranges between 0.7 and 3.5 pb as a function of its mass.

  11. HEJ 2: High energy resummation for hadron colliders - Dataset - B2FIND

    • b2find.dkrz.de
    Updated Feb 25, 2019
    + more versions
    Cite
    (2019). HEJ 2: High energy resummation for hadron colliders - Dataset - B2FIND [Dataset]. https://b2find.dkrz.de/dataset/019c9ee7-67f2-5b20-b057-c2a20e1ae2c9
    Description

    We present HEJ 2, a new implementation of the High Energy Jets formalism for high-energy resummation in hadron-collider processes as a flexible Monte Carlo event generator. In combination with a conventional fixed-order event generator, HEJ 2 can be used to obtain greatly improved predictions for a number of phenomenologically important processes by adding all-order logarithmic corrections in $\hat{s}/p_\perp^2$. A prime example of such a process is the gluon-fusion production of a Higgs boson in association with widely separated jets, which constitutes the dominant background to Higgs boson production in weak-boson fusion.

  12. ATLAS Run 1 searches for direct pair production of third-generation squarks at the Large Hadron Collider

    • hepdata.net
    • osti.gov
    Updated Sep 30, 2015
    Cite
    (2015). ATLAS Run 1 searches for direct pair production of third-generation squarks at the Large Hadron Collider [Dataset]. http://doi.org/10.17182/hepdata.69366
    Description

    CERN-LHC. This paper reviews and extends searches for the direct pair production of the scalar supersymmetric partners of the top and bottom quarks in proton--proton collisions collected by the ATLAS collaboration during the LHC Run 1. Most of the analyses use 20 fb$^{-1}$ of collisions at a centre-of-mass energy of $\sqrt{s} = 8$ TeV, although in some cases an additional 4.7 fb$^{-1}$ of collision data at $\sqrt{s} = 7$ TeV are used. New analyses are introduced to improve the sensitivity to specific regions of the model parameter space. Since no evidence of third-generation squarks is found, exclusion limits are derived by combining several analyses and are presented both in a simplified model framework, assuming simple decay chains, and within the context of more elaborate phenomenological supersymmetric models.

  13. EW-ino scan points from "SModelS v2.3: enabling global likelihood analyses" paper

    • zenodo.org
    application/gzip
    Updated Aug 23, 2023
    Cite
    Mohammad Altakach; Sabine Kraml; Andre Lessa; Sahana Narasimha; Timothee Pascal; Wolfgang Waltenberger (2023). EW-ino scan points from "SModelS v2.3: enabling global likelihood analyses" paper [Dataset]. http://doi.org/10.5281/zenodo.8086950
    Available download formats: application/gzip
    Dataset provided by: Zenodo (http://zenodo.org/)
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Input SLHA and SModelS output (.smodels and .py) files from the paper "SModelS v2.3: enabling global likelihood analyses". The dataset comprises 18557 electroweak-ino scan points and can be used to reproduce all the plots presented in the paper. A reading sketch follows the file list.

    • ewino_slha.tar.gz : input SLHA files including mass spectra, decay tables and cross sections
    • ewino_smodels_v23_combSRs.tar.gz : SModelS v2.3 output with combineSRs=True and combineAnas = ATLAS-SUSY-2018-41,CMS-SUS-21-002 (primary v2.3 results used in section 4, Figs. 2-5)
    • ewino_smodels_v23_bestSR.tar.gz : SModelS v2.3 output with combineSRs=False and combineAnas = ATLAS-SUSY-2018-41,CMS-SUS-21-002 (used only in Fig. 2)
    • ewino_smodels_v21.tar.gz : SModelS v2.1 output with combineSRs=False (used only in Fig. 2)
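
    As a hedged illustration, the SLHA inputs can be parsed with the pyslha package; the point file name below is a placeholder for any file extracted from ewino_slha.tar.gz.

    ```python
    # Minimal sketch: read a mass spectrum and decay table from an SLHA file
    # with pyslha (pip install pyslha). The file name is a placeholder.
    import pyslha

    point = pyslha.read("ewino_point.slha")
    # The MASS block maps PDG ID -> mass in GeV; 1000022 is the lightest neutralino.
    print("m(chi_1^0) =", point.blocks["MASS"][1000022], "GeV")
    for pid, particle in list(point.decays.items())[:3]:
        print("PDG", pid, "total width:", particle.totalwidth, "GeV")
    ```
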
  14. ATLAS

    • data.amerigeoss.org
    • data.wu.ac.at
    Updated Jul 28, 2019
    Cite
    United States[old] (2019). ATLAS [Dataset]. https://data.amerigeoss.org/dataset/atlas-ae255
    Description

    ATLAS is a particle physics experiment at the Large Hadron Collider at CERN, the European Organization for Nuclear Research. Scientists from Brookhaven have played and continue to play key roles in the design, construction, and operation of ATLAS in its search for new discoveries about the particles and forces that shape our universe. Brookhaven is the host laboratory for U.S. collaborators on ATLAS, and our computing facility stores, processes, and distributes ATLAS data to collaborators at universities and laboratories throughout the nation. Some research at ATLAS is complementary to studies at RHIC, but a large portion of the collisions at the LHC are aimed at very different questions.

  15. DataSheet1_Application of reinforcement learning in the LHC tune feedback.PDF

    • figshare.com
    pdf
    Updated Jun 11, 2023
    Cite
    Leander Grech; Gianluca Valentino; Diogo Alves; Simon Hirlaender (2023). DataSheet1_Application of reinforcement learning in the LHC tune feedback.PDF [Dataset]. http://doi.org/10.3389/fphy.2022.929064.s001
    Available download formats: pdf
    Dataset provided by: Frontiers
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The Beam-Based Feedback System (BBFS) was primarily responsible for correcting the beam energy, orbit and tune in the CERN Large Hadron Collider (LHC). A major code renovation of the BBFS was planned and carried out during the LHC Long Shutdown 2 (LS2). This work consists of an explorative study to solve a beam-based control problem, the tune feedback (QFB), utilising state-of-the-art Reinforcement Learning (RL). A simulation environment was created to mimic the operation of the QFB. A series of RL agents were trained, and the best-performing agents were then subjected to a set of well-designed tests. The original feedback controller used in the QFB was reimplemented to compare the performance of the classical approach to the performance of selected RL agents in the test scenarios. Results from the simulated environment show that the RL agent performance can exceed the controller-based paradigm.

  16. Measurement of long-range multiparticle azimuthal correlations with the subevent cumulant method in pp and p+Pb collisions with the ATLAS detector at the CERN Large Hadron Collider

    • explore.openaire.eu
    Updated Jan 1, 2018
    Cite
    Morad Aaboud; Georges Aad; Brad Abbott; Ovsat Bahram Oglu Abdinov; Baptiste Abeloos; Syed Haider Abidi; Hass Abouzeid; Nadine L. Abraham; Halina Abramowicz; Henso Abreu; Trygve Buanes; Ørjan Dale; Gerald Eigen; Wolfgang Liebig; Anna Lipniacka; Bertrand Martin Dit Latour; Steffen Mæland; Bjarne Stugu; Zongchang Yang; Justas Zalieckas; Magnar Kopangen Bugge; David Gordon Cameron; James Richard Catmore; Simon Feigl; Laura Franconi; Vincent Garonne; Børge Kile Gjelsten; Eirik Gramstad; Vanja Morisbak; Jon Kerr Nilsen; Henrik Oppen; Farid Ould-Saada; Silje Hattrem Raddum; Alexander Lincoln Read; Ole Myren Røhne; Heidi Sandaker; Cédric Serfon; Steinar Stapnes; Knut Oddvar Høie Vadla; Rômulo F. Abreu; Yiming Abulaiti; Bobby S. Acharya; Shunsuke Adachi; Leszek Adamczyk; Jareed Adelman; Michael Adersberger; Tim Adye; Anthony Allen Affolder; Yoav Afik; Tatjana Agatonovic-Jovin; Collaboration Atlas (2018). Measurement of long-range multiparticle azimuthal correlations with the subevent cumulant method in pp and p +Pb collisions with the ATLAS detector at the CERN Large Hadron Collider [Dataset]. https://explore.openaire.eu/search/other?orpId=nora_uio_no::40211aa02026baf8d14276d301d691c2
    Description

    A detailed study of multiparticle azimuthal correlations is presented using pp data at √s = 5.02 and 13 TeV, and p+Pb data at √s_NN = 5.02 TeV, recorded with the ATLAS detector at the CERN Large Hadron Collider. The azimuthal correlations are probed using four-particle cumulants c_n{4} and flow coefficients v_n{4} = (−c_n{4})^{1/4} for n = 2 and 3, with the goal of extracting long-range multiparticle azimuthal correlation signals and suppressing the short-range correlations. The values of c_n{4} are obtained as a function of the average number of charged particles per event, ⟨N_ch⟩, using the recently proposed two-subevent and three-subevent cumulant methods, and compared with results obtained with the standard cumulant method. The standard method is found to be strongly biased by short-range correlations, which originate mostly from jets with a positive contribution to c_n{4}. The three-subevent method, on the other hand, is found to be least sensitive to short-range correlations. The three-subevent method gives a negative c_2{4}, and therefore a well-defined v_2{4}, nearly independent of ⟨N_ch⟩, which implies that the long-range multiparticle azimuthal correlations persist to events with low multiplicity. Furthermore, v_2{4} is found to be smaller than the v_2{2} measured using the two-particle correlation method, as expected for long-range collective behavior. Finally, the measured values of v_2{4} and v_2{2} are used to estimate the number of sources relevant for the initial eccentricity in the collision geometry. The results based on the subevent cumulant technique provide direct evidence, in small collision systems, for a long-range collectivity involving many particles distributed across a broad rapidity interval.
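
    As a tiny numeric illustration of the definition quoted above, v_n{4} = (−c_n{4})^{1/4} is only well defined for a negative cumulant; the sample value below is invented.

    ```python
    # v_n{4} = (-c_n{4})**(1/4), real only when the cumulant c_n{4} is negative.
    c2_4 = -1.0e-6                                   # invented sample cumulant
    v2_4 = (-c2_4) ** 0.25 if c2_4 < 0 else float("nan")
    print(f"v2{{4}} = {v2_4:.4f}")                   # ~0.0316 for this value
    ```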

  17. Data from: Charged hadron fragmentation functions from collider data: NNPDF Collaboration

    • omicsdi.org
    xml
    Updated Oct 27, 2011
    Cite
    Bertone V (2011). Charged hadron fragmentation functions from collider data: NNPDF Collaboration. [Dataset]. https://www.omicsdi.org/dataset/biostudies-other/S-EPMC6113698
    Available download formats: xml
    Description

    We present NNFF1.1h, a new determination of unidentified charged-hadron fragmentation functions (FFs) and their uncertainties. Experimental measurements of transverse-momentum distributions for charged-hadron production in proton-(anti)proton collisions at the Tevatron and at the LHC are used to constrain a set of FFs originally determined from electron-positron annihilation data. Our analysis is performed at next-to-leading order in perturbative quantum chromodynamics. We find that the hadron-collider data are consistent with the electron-positron data and that they significantly constrain the gluon FF. We verify the reliability of our results with respect to the choice of the kinematic cut in the hadron transverse momentum applied to the hadron-collider data, and their consistency with NNFF1.0, our previous determination of the FFs of charged pions, kaons, and protons/antiprotons.

  18. Data from: HEJ 2: High energy resummation for hadron colliders

    • data.mendeley.com
    Updated Aug 29, 2019
    + more versions
    Cite
    Jeppe R. Andersen (2019). HEJ 2: High energy resummation for hadron colliders [Dataset]. http://doi.org/10.17632/24c9mb6pf9.1
    License

    GNU General Public License v2.0: https://www.gnu.org/licenses/old-licenses/gpl-2.0.en.html

    Description

    We present HEJ 2, a new implementation of the High Energy Jets formalism for high-energy resummation in hadron-collider processes as a flexible Monte Carlo event generator. In combination with a conventional fixed-order event generator, HEJ 2 can be used to obtain greatly improved predictions for a number of phenomenologically important processes by adding all-order logarithmic corrections in $\hat{s}/p_\perp^2$. A prime example of such a process is the gluon-fusion production of a Higgs boson in association with widely separated jets, which constitutes the dominant background to Higgs boson production in weak-boson fusion.

  19. Data from: SHARE with CHARM

    • data.mendeley.com
    Updated Jan 1, 2014
    Cite
    M. Petran (2014). SHARE with CHARM [Dataset]. http://doi.org/10.17632/887xv8yfy7.1
    License

    Elsevier CPC user license: https://www.elsevier.com/about/policies/open-access-licenses/elsevier-user-license/cpc-license/

    Description

    Abstract: The SHARE with CHARM program (SHAREv3) implements the statistical hadronization model description of particle production in relativistic heavy-ion collisions. Given a set of statistical parameters, the SHAREv3 program evaluates yields, and therefore also ratios, and furthermore statistical particle abundance fluctuations. The physical bulk properties of the particle source are evaluated based on all hadrons produced, including the fitted yields. The bulk properties can be prescribed as a fit input co...

    Title of program: SHARE with CHARM
    Catalogue Id: ADVD_v3_0

    Nature of problem: The understanding of hadron production incorporating the four u, d, s, c quark flavors is essential for understanding the properties of the quark-gluon plasma created in relativistic heavy-ion collisions in the Large Hadron Collider (LHC) energy domain. We describe hadron production by a hot fireball within the statistical hadronization model (SHM), allowing for the chemical nonequilibrium of all quark flavors individually. By fitting particle abundances subject to bulk property constraints in ...

    Versions of this program held in the CPC repository in Mendeley Data:
    ADVD_v1_0; SHARE, October 2004, version 1.2; 10.1016/j.cpc.2005.01.004
    ADVD_v2_0; SHAREv2; 10.1016/j.cpc.2006.07.010
    ADVD_v3_0; SHARE with CHARM; 10.1016/j.cpc.2014.02.026

    This program has been imported from the CPC Program Library held at Queen's University Belfast (1969-2018)

  20. ATLAS Top Tagging Open Data Set

    • opendata.cern.ch
    Updated 2022
    Cite
    ATLAS collaboration (2022). ATLAS Top Tagging Open Data Set [Dataset]. http://doi.org/10.7483/OPENDATA.ATLAS.FG5F.96GA
    Dataset provided by: CERN Open Data Portal
    Description

    Boosted top tagging is an essential binary classification task for experiments at the Large Hadron Collider (LHC) to measure the properties of the top quark. The ATLAS Top Tagging Open Data Set is a publicly available data set for the development of Machine Learning (ML) based boosted top tagging algorithms. The data are split into two orthogonal sets, named train and test, stored in the HDF5 file format and containing 42 million and 2.5 million jets, respectively. Both sets are composed of equal parts signal (jets initiated by a boosted top quark) and background (jets initiated by light quarks or gluons). For each jet, the data set contains:

    • The four vectors of constituent particles
    • 15 high level summary quantities evaluated on the jet
    • The four vector of the whole jet
    • A training weight
    • A signal (1) vs background (0) label.

    There is one rule for using this data set: the contribution to a loss function from any jet should always be weighted by the training weight. Apart from this, a model may separate the signal jets from the background by whatever means necessary; a weighted-loss sketch follows.
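
    A minimal sketch of honoring that rule is shown below, using a weighted binary cross-entropy in NumPy. The label, weight, and score arrays are hypothetical placeholders for quantities read from the HDF5 files.

    ```python
    # Minimal sketch: weight each jet's loss term by its training weight.
    import numpy as np

    labels = np.array([1, 0, 1, 0])            # signal = 1, background = 0
    weights = np.array([0.5, 2.0, 1.0, 1.5])   # per-jet training weights
    scores = np.array([0.9, 0.2, 0.6, 0.4])    # model outputs in (0, 1)

    # Per-jet binary cross-entropy, then the weighted mean demanded by the rule.
    eps = 1e-7
    bce = -(labels * np.log(scores + eps) + (1 - labels) * np.log(1 - scores + eps))
    loss = np.average(bce, weights=weights)
    print(f"weighted BCE = {loss:.4f}")
    ```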
