Proton-proton collision event generation is a key step in simulating high-energy physics experiments at particle colliders, such as the Large Hadron Collider (LHC) at CERN. Here's a brief overview of the process using the software tools you mentioned:
MadGraph: This software package simulates high-energy particle collisions and generates event samples. It uses Feynman diagrams to calculate the cross sections for various collision processes and can generate events with user-specified cuts and kinematic constraints. MadGraph also has an interface to Pythia for showering and hadronization of the generated events.
Pythia: This is a particle physics event generator designed for the simulation of high-energy collisions. It simulates the parton showering and hadronization processes that occur after the hard scatter, using various models and parameters. Pythia also includes a detailed simulation of the underlying event, which accounts for the multiple parton interactions that occur in a proton-proton collision.
HepMC2: This is a file format for storing Monte Carlo event samples in high-energy physics. It is used as an input/output format by many event generators and analysis tools, including Pythia and Delphes. HepMC2 files contain information on the particles produced in a collision event, their kinematics, and the interactions that led to their production.
Delphes: This is a software framework for simulating the response of a particle detector to high-energy collisions. It takes event samples generated by Pythia or other event generators and simulates the particle interactions in a detector, including effects such as energy deposition, tracking, and calorimetry. Delphes produces output files in a ROOT format that can be analyzed using ROOT.
ROOT: This is a data analysis framework widely used in high-energy physics. It includes tools for manipulating and analyzing large datasets, including the simulation and reconstruction output files produced by event generators and detector simulation tools. ROOT also includes a powerful visualization tool for generating 2D and 3D plots of particle collisions and detector interactions.
The proton-proton collision event generation process involves using MadGraph to generate hard scattering events, which are then passed to Pythia for parton showering and hadronization. The resulting events are stored in HepMC2 format, and the detector response is then simulated with Delphes. The final output is a ROOT file that can be analyzed with the tools of the ROOT framework.
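As a concrete sketch of this chain, the snippet below composes a MadGraph5_aMC@NLO command script and the shell invocations that would run the chain end to end. The process, beam energies, and the file and executable names (run.mg5, events.hepmc, out.root, the Delphes card) are illustrative assumptions, not a fixed recipe; nothing is executed here, the functions only build the command text.

```python
# Sketch of the MadGraph -> Pythia -> Delphes chain described above.
# The process, beam energies, and file/executable names are illustrative
# assumptions; adapt them to your installation. Nothing is executed here:
# the functions only compose the command text.

def mg5_script(process="p p > t t~", nevents=1000, ebeam_gev=6500.0):
    """Compose a MadGraph5_aMC@NLO command script as a string."""
    return "\n".join([
        f"generate {process}",      # hard-scatter process from Feynman diagrams
        "output pp_run",
        "launch pp_run",
        "shower=Pythia8",           # hand the hard events to Pythia 8
        f"set nevents {nevents}",
        f"set ebeam1 {ebeam_gev}",  # 6500 + 6500 GeV beams = 13 TeV collisions
        f"set ebeam2 {ebeam_gev}",
        "done",
    ])

def pipeline_commands(mg5_bin="mg5_aMC", delphes_bin="DelphesHepMC2",
                      card="delphes_card_ATLAS.tcl"):
    """Shell commands for the full chain (hypothetical file names)."""
    return [
        f"{mg5_bin} run.mg5",                           # generate + shower
        f"{delphes_bin} {card} out.root events.hepmc",  # detector simulation
    ]

print(mg5_script())
```

In practice the script would be saved to run.mg5 and each command run in a shell; the exact run_card parameters depend on the MadGraph version in use.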
High-level overview of the steps involved in generating event data for high-energy physics experiments using Monte Carlo simulation:
Specify the collision process: The first step is to specify the collision process that you want to simulate. This typically involves specifying the particles that will collide (e.g., protons, electrons, or other particles), the energy of the collision, and any relevant initial or final states.
Generate hard scattering events: Once the collision process is specified, you can use a software package like MadGraph to generate hard scattering events. MadGraph uses Feynman diagrams to calculate the cross sections for various collision processes and generates events based on user-specified cuts and kinematic constraints.
Apply parton showering and hadronization: After generating the hard scattering events, you can use a software package like Pythia to simulate the parton showering and hadronization processes that occur after the hard scatter. This involves simulating the radiation of additional partons off those produced in the hard scattering event (the parton shower) and their subsequent fragmentation into hadrons (hadronization).
Simulate the detector response: Once you have generated a set of simulated events, you can use a software package like Delphes to simulate the response of the particle detector to the collisions. This involves simulating the interactions of particles with the detector material and the detector's response to the energy deposited by the particles.
Analyze the data: Once you have generated and simulated the events, you can analyze the resulting data to extract information about the properties of the particles produced in the collision. This typically involves applying various cuts and selection criteria to the data and using statistical techniques to estimate the background and systematic uncertainties in the analysis.
Compare with experimental data: Finally, you can compare the results of your Monte Carlo simulation with experimental data to test the accuracy of the simulation and to gain insights into the underlying physics of the collision process.
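The analysis step above (applying cuts and estimating statistical uncertainties) can be illustrated with a purely synthetic toy: an exponential pT spectrum and a flat pseudorapidity distribution stand in for real generator output, and the cut values are typical but arbitrary.

```python
import math
import random

# Toy illustration of a cut-based selection. The "events" are synthetic
# (exponential pT spectrum, flat eta), not the output of any real generator;
# the cut values are typical but arbitrary.

random.seed(42)

def toy_event():
    """One pseudo-particle: (pT in GeV, pseudorapidity eta)."""
    pt = random.expovariate(1 / 30.0)   # mean pT of 30 GeV
    eta = random.uniform(-5.0, 5.0)
    return pt, eta

def passes_cuts(pt, eta, pt_min=20.0, eta_max=2.5):
    """Typical single-object selection: hard and central."""
    return pt > pt_min and abs(eta) < eta_max

n_events = 100_000
n_pass = sum(passes_cuts(*toy_event()) for _ in range(n_events))
acceptance = n_pass / n_events
# Binomial (statistical) uncertainty on the acceptance:
stat_err = math.sqrt(acceptance * (1 - acceptance) / n_events)
print(f"acceptance = {acceptance:.3f} +/- {stat_err:.3f}")
```

For this toy the expected acceptance is exp(-20/30) * 0.5, roughly 0.26; a real analysis would add background estimation and systematic uncertainties on top of this purely statistical treatment.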
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
For the foreseeable future, the exploration of the high-energy frontier will be the domain of the Large Hadron Collider (LHC). Of particular significance will be its high-luminosity upgrade (HL-LHC), which will operate until the mid-2030s. In this endeavour, an improved understanding of the parton distribution functions (PDFs) of the proton is critical for the full exploitation of the HL-LHC physics potential.
Since the start of data taking, the LHC has provided an impressive wealth of information on the quark and gluon structure of the proton. Indeed, modern global PDF analyses include a wide range of LHC measurements of processes such as the production of jets, electroweak gauge bosons, and top quark pairs. Here we provide quantitative projections for global PDF fits that include the information expected from the High-Luminosity LHC (HL-LHC), in the LHAPDF format. These projections have already been used, in particular, in the HL-LHC CERN Yellow Reports.
The HL-LHC program would be uniquely complemented by the proposed Large Hadron electron Collider (LHeC), a high-energy lepton-proton and lepton-nucleus collider based at CERN. Here we also present PDF projections based on the expected LHeC measurements of inclusive and heavy-quark structure functions. These projections are presented both for the LHeC alone and in combination with the HL-LHC pseudo-data.
The LHAPDF sets made available in this repository are the following:
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Explore the Large Hadron Collider: unraveling the mysteries of the universe through unique data from multiple sources, including key facts, real-time news, interactive charts, detailed maps, and open datasets.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
\(Z \to \ell \ell\) background events reconstructed by inclusive single-muon selection.
Events are represented as an array of physics-motivated high-level features.
Details are given in https://arxiv.org/abs/1811.10276
CERN-LHC. The Large Hadron Collider forward (LHCf) experiment is designed to use the LHC to verify the hadronic-interaction models used in cosmic-ray physics. Forward baryon production is one of the crucial points for understanding the development of cosmic-ray showers. We report the neutron energy spectra for proton-proton collisions at $\sqrt{s} = 7$ TeV at the LHC, in the pseudorapidity ($\eta$) ranges 8.81 to 8.99, 8.99 to 9.22, and 10.76 to infinity.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
PyHEADTAIL macro-particle simulations (http://github.com/PyCOMPLETE/PyHEADTAIL) of single-bunch stability in the High Luminosity LHC at top energy led to the present data. Head-tail instabilities driven by the machine impedance can seriously degrade the beam quality and even lead to machine protection issues. Here we simulate, with smooth-approximation optics, recent versions of the complete HL-LHC impedance model at flat-top conditions at an energy of 7 TeV. We compare the influence of the different wake terms (dipolar; dipolar + cross-terms; dipolar + cross-terms + quadrupolar), the longitudinal distribution (thermal and q-Gaussian with q = 3/5), and the different impedance scenarios (without MoGr-coated TCTs yet). For each case, we scan the linear chromaticity. The transverse feedback is taken into account with an ideal damper model at 50 turns. Longitudinal non-linear synchrotron motion is employed. Lattice non-linearities (such as detuning with amplitude from octupoles) are not considered. Here we consider STABLE conditions, i.e. the impedance model for fully squeezed beta* = 15 cm at IP1 and IP5.
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
We discuss the collider phenomenology of the model of Minimal Universal Extra Dimensions (MUED) at the Large Hadron Collider (LHC). We derive analytical results for all relevant strong pair-production processes of two level-1 Kaluza–Klein partners and use them to validate and correct the existing MUED implementation in the Fortran version of the Pythia event generator. We also develop a new implementation of the model in the C++ version of Pythia. We use our implementations in conjunction with the CheckMATE package to derive the LHC bounds on MUED from a large number of published experimental analyses from Run 1 at the LHC. The previous version of this program (ACTU_v2_1) may be found at http://dx.doi.org/10.1016/j.cpc.2009.08.008.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
\(W \to \ell \nu\) background events reconstructed by inclusive single-muon selection.
Events are represented as an array of physics-motivated high-level features.
Details are given in https://arxiv.org/abs/1811.10276
CERN-LHC. Searches are performed for resonant and non-resonant Higgs boson pair production in the $hh \to \gamma\gamma b\bar{b}$ final state using 20 fb$^{-1}$ of proton-proton collisions at a center-of-mass energy of 8 TeV recorded with the ATLAS detector at the CERN Large Hadron Collider. A 95% confidence level upper limit on the cross section times branching ratio of non-resonant production is set at 2.2 pb, while the expected limit is 1.0 pb. The corresponding limit observed for a narrow resonance ranges between 0.7 and 3.5 pb as a function of its mass.
We present HEJ 2, a new implementation of the High Energy Jets formalism for high-energy resummation in hadron-collider processes as a flexible Monte Carlo event generator. In combination with a conventional fixed-order event generator, HEJ 2 can be used to obtain greatly improved predictions for a number of phenomenologically important processes by adding all-order logarithmic corrections in $\hat{s}/p_\perp^2$. A prime example of such a process is the gluon-fusion production of a Higgs boson in association with widely separated jets, which constitutes the dominant background to Higgs boson production in weak-boson fusion.
CERN-LHC. This paper reviews and extends searches for the direct pair production of the scalar supersymmetric partners of the top and bottom quarks in proton-proton collisions collected by the ATLAS collaboration during the LHC Run 1. Most of the analyses use 20 fb$^{-1}$ of collisions at a centre-of-mass energy of $\sqrt{s} = 8$ TeV, although in some cases an additional 4.7 fb$^{-1}$ of collision data at $\sqrt{s} = 7$ TeV are used. New analyses are introduced to improve the sensitivity to specific regions of the model parameter space. Since no evidence of third-generation squarks is found, exclusion limits are derived by combining several analyses and are presented both in a simplified model framework, assuming simple decay chains, and within the context of more elaborate phenomenological supersymmetric models.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Input SLHA and SModelS output (.smodels and .py) files from the paper "SModelS v2.3: enabling global likelihood analyses". The dataset comprises 18557 electroweak-ino scan points and can be used to reproduce all the plots presented in the paper.
ATLAS is a particle physics experiment at the Large Hadron Collider at CERN, the European Organization for Nuclear Research. Scientists from Brookhaven have played and continue to play key roles in the design, construction, and operation of ATLAS in its search for new discoveries about the particles and forces that shape our universe. Brookhaven is the host laboratory for U.S. collaborators on ATLAS, and our computing facility stores, processes, and distributes ATLAS data to collaborators at universities and laboratories throughout the nation. Some research at ATLAS is complementary to studies at RHIC, but a large portion of the collisions at the LHC are aimed at very different questions.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
The Beam-Based Feedback System (BBFS) was primarily responsible for correcting the beam energy, orbit and tune in the CERN Large Hadron Collider (LHC). A major code renovation of the BBFS was planned and carried out during the LHC Long Shutdown 2 (LS2). This work consists of an exploratory study of solving a beam-based control problem, the tune feedback (QFB), using state-of-the-art Reinforcement Learning (RL). A simulation environment was created to mimic the operation of the QFB. A series of RL agents were trained, and the best-performing agents were then subjected to a set of well-designed tests. The original feedback controller used in the QFB was reimplemented to compare the performance of the classical approach to that of selected RL agents in the test scenarios. Results from the simulated environment show that the RL agent performance can exceed the controller-based paradigm.
A detailed study of multiparticle azimuthal correlations is presented using pp data at √s = 5.02 and 13 TeV, and p+Pb data at √s_NN = 5.02 TeV, recorded with the ATLAS detector at the CERN Large Hadron Collider. The azimuthal correlations are probed using four-particle cumulants c_n{4} and flow coefficients v_n{4} = (−c_n{4})^1/4 for n = 2 and 3, with the goal of extracting long-range multiparticle azimuthal correlation signals and suppressing the short-range correlations. The values of c_n{4} are obtained as a function of the average number of charged particles per event, ⟨N_ch⟩, using the recently proposed two-subevent and three-subevent cumulant methods, and compared with results obtained with the standard cumulant method. The standard method is found to be strongly biased by short-range correlations, which originate mostly from jets with a positive contribution to c_n{4}. The three-subevent method, on the other hand, is found to be least sensitive to short-range correlations. The three-subevent method gives a negative c_2{4}, and therefore a well-defined v_2{4}, nearly independent of ⟨N_ch⟩, which implies that the long-range multiparticle azimuthal correlations persist to events with low multiplicity. Furthermore, v_2{4} is found to be smaller than the v_2{2} measured using the two-particle correlation method, as expected for long-range collective behavior. Finally, the measured values of v_2{4} and v_2{2} are used to estimate the number of sources relevant for the initial eccentricity in the collision geometry. The results based on the subevent cumulant technique provide direct evidence, in small collision systems, for a long-range collectivity involving many particles distributed across a broad rapidity interval.
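For readers unfamiliar with the cumulant machinery, a minimal numerical check (not the ATLAS analysis code) shows that the standard Q-vector expression for the two-particle correlator ⟨2⟩ = ⟨cos n(φ_i − φ_j)⟩ reproduces the brute-force average over all distinct pairs in a single event; the four-particle cumulants c_n{4} and the subevent variants build on the same Q-vector technique.

```python
import cmath
import math
import random

# Minimal numerical check (not the ATLAS analysis code): the Q-vector formula
# for the two-particle azimuthal correlator <2> = <cos n(phi_i - phi_j)>
# must agree with the brute-force average over all distinct pairs.

random.seed(1)
n = 2                                    # harmonic order (elliptic flow)
phis = [random.uniform(0.0, 2.0 * math.pi) for _ in range(50)]
M = len(phis)

# Q-vector method: Q_n = sum_k exp(i n phi_k);  <2> = (|Q_n|^2 - M) / (M(M-1))
Q = sum(cmath.exp(1j * n * phi) for phi in phis)
corr_q = (abs(Q) ** 2 - M) / (M * (M - 1))

# Brute-force average over all ordered pairs with i != j
corr_bf = sum(math.cos(n * (phis[i] - phis[j]))
              for i in range(M) for j in range(M) if i != j) / (M * (M - 1))

assert abs(corr_q - corr_bf) < 1e-9
```

The identity holds because |Q_n|² expands into M self-pair terms plus the sum of cos n(φ_i − φ_j) over all distinct pairs (the imaginary parts cancel pairwise), which is why the Q-vector trick replaces an O(M²) pair loop with an O(M) sum.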
We present NNFF1.1h, a new determination of unidentified charged-hadron fragmentation functions (FFs) and their uncertainties. Experimental measurements of transverse-momentum distributions for charged-hadron production in proton-(anti)proton collisions at the Tevatron and at the LHC are used to constrain a set of FFs originally determined from electron-positron annihilation data. Our analysis is performed at next-to-leading order in perturbative quantum chromodynamics. We find that the hadron-collider data are consistent with the electron-positron data and that they significantly constrain the gluon FF. We verify the stability of our results with respect to the choice of the kinematic cut in the hadron transverse momentum applied to the hadron-collider data, and their consistency with NNFF1.0, our previous determination of the FFs of charged pions, kaons, and protons/antiprotons.
https://www.elsevier.com/about/policies/open-access-licenses/elsevier-user-license/cpc-license/
Abstract: The SHARE with CHARM program (SHAREv3) implements the statistical hadronization model description of particle production in relativistic heavy-ion collisions. Given a set of statistical parameters, the SHAREv3 program evaluates yields, and therefore also ratios, as well as statistical particle abundance fluctuations. The physical bulk properties of the particle source are evaluated based on all hadrons produced, including the fitted yields. The bulk properties can be prescribed as a fit input co...
Title of program: SHARE with CHARM
Catalogue Id: ADVD_v3_0
Nature of problem: The understanding of hadron production incorporating the four u, d, s, c quark flavors is essential for understanding the properties of the quark-gluon plasma created in relativistic heavy-ion collisions in the Large Hadron Collider (LHC) energy domain. We describe hadron production by a hot fireball within the statistical hadronization model (SHM), allowing for the chemical nonequilibrium of all quark flavors individually. By fitting particle abundances subject to bulk property constraints in ...
Versions of this program held in the CPC repository in Mendeley Data:
ADVD_v1_0; SHARE, October 2004, version 1.2; 10.1016/j.cpc.2005.01.004
ADVD_v2_0; SHAREv2; 10.1016/j.cpc.2006.07.010
ADVD_v3_0; SHARE with CHARM; 10.1016/j.cpc.2014.02.026
This program has been imported from the CPC Program Library held at Queen's University Belfast (1969-2018)
Boosted top tagging is an essential binary classification task for experiments at the Large Hadron Collider (LHC) to measure the properties of the top quark. The ATLAS Top Tagging Open Data Set is a publicly available data set for the development of Machine Learning (ML) based boosted top tagging algorithms. The data are split into two orthogonal sets, named train and test, and stored in the HDF5 file format, containing 42 million and 2.5 million jets, respectively. Both sets are composed of equal parts signal (jets initiated by a boosted top quark) and background (jets initiated by light quarks or gluons). For each jet, the data set contains:
There is one rule for using this data set: the contribution to a loss function from any jet should always be weighted by the training weight. Apart from this, a model should separate the signal jets from the background by whatever means necessary.
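The weighting rule can be sketched as a per-jet weighted binary cross-entropy. This plain-Python version is illustrative only; in a real training loop the weights would typically be passed to the ML framework's loss as a sample_weight-style argument.

```python
import math

# Sketch of the weighting rule above: each jet's contribution to the loss is
# multiplied by its training weight. Plain-Python binary cross-entropy for
# illustration; a real ML framework would apply the weights inside its own
# loss function.

def weighted_bce(y_true, y_pred, weights, eps=1e-12):
    """Weighted binary cross-entropy, normalised by the sum of the weights."""
    total = 0.0
    for y, p, w in zip(y_true, y_pred, weights):
        p = min(max(p, eps), 1.0 - eps)  # clip to avoid log(0)
        total += w * -(y * math.log(p) + (1.0 - y) * math.log(1.0 - p))
    return total / sum(weights)

# With unit weights this reduces to the ordinary mean BCE:
loss = weighted_bce([1, 0, 1], [0.9, 0.2, 0.8], [1.0, 1.0, 1.0])
```

Normalising by the sum of the weights keeps the loss scale independent of any overall rescaling of the training weights, so only their relative sizes matter.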