Generating particle physics Lagrangians with transformers
Koay, Yong Sheng
Enberg, Rikard
Moretti, Stefano
Camargo-Molina, Eliel
Abstract
In physics, Lagrangians provide a systematic way to describe the laws governing physical systems. In particle physics, they encode the interactions and behavior of the fundamental building blocks of our universe. By treating Lagrangians as complex, rule-based constructs similar to linguistic expressions, we trained a transformer model, an architecture proven effective in natural language tasks, to predict the Lagrangian corresponding to a given list of particles. We report on the transformer's performance in constructing Lagrangians that respect the Standard Model SU(3)×SU(2)×U(1) gauge symmetries. The resulting model achieves high accuracies (over 90%) for Lagrangians with up to six matter fields, and shows the capacity to generalize beyond the training distribution, albeit within architectural constraints. Through an analysis of input embeddings, we show that the model internalized concepts such as group representations and conjugation operations as it learned to generate Lagrangians. We make the model and training datasets available to the community. An interactive demonstration can be found at: \url{this https URL}.
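The abstract describes a sequence-to-sequence task: map a list of particles, specified by their gauge quantum numbers, to a Lagrangian token sequence. The sketch below illustrates how such a setup could look in PyTorch. The vocabulary, the tokenization of gauge representations, and all model sizes are placeholder assumptions for exposition only, not the authors' actual scheme (their model and datasets are linked from this record).

```python
# Illustrative sketch only: vocabulary, tokenization, and hyperparameters
# are assumptions for exposition, not the paper's actual setup.
import torch
import torch.nn as nn

# Hypothetical vocabulary: each matter field is described by its
# representations under SU(3) x SU(2) x U(1), plus structural tokens.
VOCAB = ["<pad>", "<sos>", "<eos>", "FIELD", "SU3_3", "SU3_3bar", "SU3_1",
         "SU2_2", "SU2_1", "U1_+1/6", "U1_-1/2", "DAGGER", "PLUS", "TERM"]
tok2id = {t: i for i, t in enumerate(VOCAB)}

def encode(tokens):
    """Map a token sequence (particle list or Lagrangian) to a batch of ids."""
    return torch.tensor([[tok2id[t] for t in tokens]])

class Seq2SeqTransformer(nn.Module):
    """Minimal encoder-decoder transformer from particle-list tokens
    to Lagrangian tokens (sizes are placeholders)."""
    def __init__(self, vocab_size, d_model=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(d_model=d_model, nhead=4,
                                          num_encoder_layers=2,
                                          num_decoder_layers=2,
                                          batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids):
        src = self.embed(src_ids)
        tgt = self.embed(tgt_ids)
        # Causal mask so the decoder emits Lagrangian tokens autoregressively.
        mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        hidden = self.transformer(src, tgt, tgt_mask=mask)
        return self.out(hidden)  # logits over the Lagrangian vocabulary

# Example input: a left-handed-quark-like field, (3, 2, +1/6) under the SM group.
src = encode(["FIELD", "SU3_3", "SU2_2", "U1_+1/6"])
tgt = encode(["<sos>", "TERM", "FIELD", "DAGGER", "FIELD"])
model = Seq2SeqTransformer(len(VOCAB))
logits = model(src, tgt)
print(logits.shape)  # (1, target_length, vocab_size)
```

An encoder-decoder layout is a natural fit for the task as stated: the encoder consumes the particle list while the decoder generates the Lagrangian term by term, which is consistent with the per-Lagrangian accuracies the abstract reports.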
Text: 2501.09729v1 (Author's Original). Available under License Other.
More information
Accepted/In Press date: 16 January 2025
Additional Information: 32 pages, 11 figures, 18 tables
Keywords:
cs.LG, cs.SC, hep-ph, hep-th
Identifiers
Local EPrints ID: 499113
URI: http://eprints.soton.ac.uk/id/eprint/499113
PURE UUID: f510a92c-9772-471c-abb1-0d21548d7d20
Catalogue record
Date deposited: 10 Mar 2025 17:34
Last modified: 11 Mar 2025 02:39
Contributors
Author: Yong Sheng Koay
Author: Rikard Enberg
Author: Stefano Moretti
Author: Eliel Camargo-Molina