#HighLuminosityLHC
QVAE: Quantum Variational Autoencoders for the LHC with Quantum AI
To address the processing constraints of the upcoming upgrades to CERN's Large Hadron Collider (LHC), TRIUMF, the Perimeter Institute for Theoretical Physics, and D-Wave Quantum Inc. have collaborated on quantum-AI particle physics modelling. Their study, published in npj Quantum Information, is the first to apply a quantum annealing device to the computationally expensive simulation of particle showers at the LHC.
The Challenge: Computational Bottlenecks at the LHC
The LHC collides protons to detect particles such as the Higgs boson. Once upgraded to the High-Luminosity LHC (HL-LHC), its collision rate will increase roughly tenfold. The enhancement will sharpen measurements and open access to rare processes, but it will also create serious computing challenges.
Simulated collisions are needed to design experiments, calibrate detectors, test data against physical hypotheses, and analyse experimental results. These simulations typically rely on first-principles programs such as GEANT4. However, GEANT4 takes roughly 1000 CPU seconds to simulate a single event, and over the HL-LHC phase this computational load is projected to grow to millions of CPU-years annually, which is “financially and environmentally unsustainable”.
Simulating particle-calorimeter interactions accounts for much of this processing effort. Calorimeters measure a particle's energy via the shower it induces in the detector's active material. Simulating these complex showers is the most computationally demanding Monte Carlo (MC) modelling task, yet it is essential for accurate measurements.
The 2022 CaloChallenge provided common datasets so that groups could build and compare fast calorimeter simulations. Notably, this collaboration's research team is the only one to tackle the challenge from a fully quantum standpoint.
Hybrid Quantum-AI Solution: CaloQVAE and Calo4pQVAE
To address these problems, the researchers developed CaloQVAE, a quantum-AI hybrid that was later improved into Calo4pQVAE. The models combine quantum annealing with advances in generative modelling to simulate high-energy particle-calorimeter interactions quickly and accurately.
In essence, Calo4pQVAE is a variational autoencoder (VAE) with a restricted Boltzmann machine (RBM) prior. VAEs are latent-variable generative models trained by maximising an evidence lower bound (ELBO) on the true log-likelihood. The RBM, a universal approximator of discrete distributions, increases the model's expressivity. Conditioned on the incident particle energy, the model generates synthetic showers.
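To make the RBM prior concrete, here is a minimal numpy sketch of an RBM's energy function and block-Gibbs sampling. All sizes, seeds, and weights are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy RBM prior over a binary latent space; shapes are illustrative.
n_vis, n_hid = 8, 6
W = rng.normal(scale=0.1, size=(n_vis, n_hid))  # couplings
b = np.zeros(n_vis)                             # visible biases
c = np.zeros(n_hid)                             # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def energy(v, h):
    """RBM energy E(v, h) = -b.v - c.h - v.W.h; low energy = high probability."""
    return -(v @ b + h @ c + v @ W @ h)

def gibbs_step(v):
    """One block-Gibbs sweep: sample h given v, then v given h."""
    h = (rng.random(n_hid) < sigmoid(c + v @ W)).astype(float)
    v = (rng.random(n_vis) < sigmoid(b + W @ h)).astype(float)
    return v, h

# Draw an approximate sample from the RBM prior by iterating the chain.
v = rng.integers(0, 2, n_vis).astype(float)
for _ in range(100):
    v, h = gibbs_step(v)
```

In the hybrid model, this classical chain is what the quantum annealer's native sampling replaces.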
In the original VAE, the encoder (qϕ(z|x,e)) and decoder (pθ(x|z,e)) are fully connected neural networks conditioned on the incident particle energy. Calo4pQVAE replaces these with 3D convolutional layers and applies periodic boundary conditions to respect the showers' cylindrical geometry, together with a discrete binary latent space and a Boltzmann prior distribution.
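The periodic boundary condition amounts to circular ("wrap") padding along the angular axis, so a convolution sees the cylinder as closed. A minimal numpy sketch with an illustrative toy shower tensor (the axis layout is an assumption for illustration):

```python
import numpy as np

# Toy shower tensor: (depth layers, angular bins, radial bins).
shower = np.arange(2 * 4 * 3, dtype=float).reshape(2, 4, 3)

# Circular padding along the angular axis only: the last angular bin
# becomes the neighbour of the first, matching the cylindrical geometry.
padded = np.pad(shower, ((0, 0), (1, 1), (0, 0)), mode="wrap")
```

After padding, `padded[:, 0, :]` equals the original last angular slice, so a 3-wide convolution along that axis wraps around the cylinder seam.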
The integration of D-Wave's annealing quantum computing technology is the key novelty. The researchers used the D-Wave 2000Q annealer to draw samples from CaloQVAE's latent space. To fit the RBM onto the sparsely connected QPU architecture (the Chimera graph topology), they introduced a masking function. In Calo4pQVAE, the RBM's bipartite graph was replaced with a four-partite graph so that D-Wave's more advanced, Pegasus-structured Advantage quantum annealer could be used for sampling.
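One way to realise such a masking function is to zero out every RBM coupling that has no physical coupler on the QPU working graph. A minimal numpy sketch with a hypothetical random adjacency mask standing in for the real Chimera adjacency, which would come from the device:

```python
import numpy as np

rng = np.random.default_rng(1)

# Dense RBM weight matrix between two partitions of latent units.
n_a, n_b = 6, 6
W = rng.normal(size=(n_a, n_b))

# Hypothetical QPU working-graph adjacency: 1 where a physical coupler
# exists between the two embedded qubits, 0 otherwise.
mask = (rng.random((n_a, n_b)) < 0.3).astype(float)

# The mask keeps only couplings the hardware can realise; applying it
# after every gradient update keeps the forbidden weights at zero.
W_masked = W * mask
```

Training then proceeds on `W_masked`, so the learned model is always embeddable on the annealer.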
The researchers also found an unconventional way to make D-Wave's annealers generate conditionally: they “hijacked” a mechanism in the quantum processor that maintains a qubit's bias-to-weight ratio. By fixing a subset of qubits (σz(k)), the processor can be conditioned to hold preset states throughout annealing, so the device produces showers with desired features such as the impinging particle's energy.
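The classical analogue of fixing qubits is clamping a subset of units during Gibbs sampling: the clamped units never update, and the rest of the model equilibrates around them. A minimal numpy sketch with hypothetical couplings (not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical symmetric couplings and biases for 8 binary units.
n = 8
J = rng.normal(scale=0.3, size=(n, n))
J = (J + J.T) / 2
np.fill_diagonal(J, 0)
bias = np.zeros(n)

# Units held at preset values for the whole run -- the classical
# analogue of fixing qubits sigma_z(k) during the anneal.
clamped = {0: 1.0, 1: 0.0}

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

state = rng.integers(0, 2, n).astype(float)
for i, val in clamped.items():
    state[i] = val

for _ in range(200):              # Gibbs sweeps
    for i in range(n):
        if i in clamped:
            continue              # conditioned units never flip
        field = bias[i] + J[i] @ state
        state[i] = float(rng.random() < sigmoid(field))
```

The free units are then samples from the model conditioned on the clamped values, which is what encoding the incident energy into fixed qubits achieves on hardware.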
This conditioning uses the quantum annealer's flux-bias parameters, allowing classical RBM capabilities to be combined flexibly with quantum annealing's speed and scalability. The work also proposes an adaptive method for determining the quantum annealer's effective inverse temperature, a technique that could benefit other quantum machine learning applications.
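The idea behind estimating an effective inverse temperature can be illustrated on a toy model: since the Boltzmann mean energy decreases monotonically with β, one can bisect on β until the model's mean energy matches the mean energy of the hardware samples. This numpy sketch uses a tiny exactly enumerable model and simulated "hardware" samples; it is a conceptual illustration, not the paper's adaptive scheme:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)

# Tiny binary model (64 states) so the Boltzmann distribution is exact.
n = 6
J = rng.normal(scale=0.5, size=(n, n))
J = (J + J.T) / 2
np.fill_diagonal(J, 0)

states = np.array(list(product([0, 1], repeat=n)), dtype=float)
E = -0.5 * np.einsum("si,ij,sj->s", states, J, states)

def mean_energy(beta):
    """Expected energy under the Boltzmann distribution at inverse temperature beta."""
    w = np.exp(-beta * (E - E.min()))
    return (w * E).sum() / w.sum()

# Simulate a sampler operating at an unknown effective beta (here 1.7).
beta_true = 1.7
p = np.exp(-beta_true * (E - E.min()))
p /= p.sum()
samples = rng.choice(len(E), size=5000, p=p)
target = E[samples].mean()

# Bisection: mean energy falls monotonically as beta rises.
lo, hi = 0.01, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > target:
        lo = mid
    else:
        hi = mid
beta_eff = 0.5 * (lo + hi)
```

Knowing β_eff lets the classical training loop rescale the RBM parameters so the hardware samples match the intended Boltzmann distribution.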
Performance and Benefits
The findings show that the quantum-AI hybrid approach performs promisingly on several metrics:
The Quantum Processing Unit (QPU) annealing time is about 20 µs per sample, roughly 20 times faster than generating samples on a GPU. Although the total quantum sampling rate (~0.4 ms per sample) is only somewhat faster than the classical GPU rate (~0.5 ms per sample), the raw annealing speed suggests that optimised engineering could decisively beat classical methods: conventional methods took about 1 second to generate 1024 samples, versus about 0.1 seconds with quantum annealing (assuming a single QPU programming cycle).
Synthetic data from the CaloQVAE model reproduces the major patterns in real data. Accuracy on particle-classification tasks such as e+ vs. π+ is comparable to CaloGAN and other approaches. Shower-shape variables from the generative models match GEANT4 data qualitatively, showing that the models capture the significant features and correlations. The quantum device's sample quality is on par with modern Monte Carlo methods, and both the classical (DVAE) and quantum (QVAE) variants reproduced real GEANT4 data under energy conditioning. On the FPD and KPD metrics, the framework outperforms more than half of the CaloChallenge models.
Energy consumption and computational efficiency are a key factor. Unlike classical GPUs, D-Wave quantum computers draw roughly the same power regardless of problem size, suggesting that QPU-based simulation could scale to high-demand workloads without a corresponding growth in energy use.
Institutional Collaboration and Future Implications
This work was conducted by TRIUMF, the Perimeter Institute for Theoretical Physics, and D-Wave Quantum Inc., with further contributions from collaborators in Virginia, British Columbia, and the NRC.
The team will next test its models on new data to improve both speed and accuracy. They plan to upgrade to D-Wave's latest quantum annealer (Advantage2_prototype2.4), which offers more couplers per qubit and reduced noise, to examine alternative RBM topologies, and to modify the decoder module to increase simulation quality.
If it scales, this method could generate synthetic data for manufacturing, healthcare, finance, and other fields beyond particle physics. The authors anticipate that annealing quantum computers will become essential to simulation, with larger-scale quantum-coherent devices serving as priors in deep generative models. The work points towards quantum computing as a tool for fundamental physics research.
Billions in funding for the High Luminosity LHC

At CERN, a step forward towards the super collider. The council of the prestigious European particle physics organisation has approved a €21 billion project for a new 100-kilometre circular accelerator that will continue the Large Hadron Collider's research programme at even higher energies. The decision was adopted unanimously by the CERN Council on 19 June, after an independent committee approved the plan in March. The organisation will need global support to fund the project, which is expected to cost at least €21 billion and would be the successor to CERN's famous Large Hadron Collider. By mid-century, the new machine will collide electrons with their antimatter partners, positrons. Built in an underground tunnel near CERN's headquarters in Geneva, Switzerland, it will allow physicists to study the properties of the Higgs boson and, later on, host an even more powerful proton-proton collider that would run into the second half of the century. This is not yet a definitive green light.
A major upgrade to the Large Hadron Collider is underway
The Large Hadron Collider (LHC) is getting an upgrade that will let researchers collect approximately 10 times more data than they can now. Currently, the particle accelerator can produce up to one billion proton-proton collisions, but that number will increase significantly once the upgrades are in place. Today, a ground-breaking ceremony kicked off the work that's scheduled to be…