It's basically the opposite situation from 150 years ago.
Back then, we thought our theory was more or less complete while sitting on experimental data that contradicted it (the Michelson-Morley experiment, the anomalous precession of Mercury's perihelion, and surely others).
Right now, we know our theories are incomplete (since GR and QFT are incompatible) while having no experimental data which contradicts them.
Here is one fact that seems, to me, pretty convincing evidence that there is another layer underneath what we know.
The charge of the electron is -1 and that of the proton is +1. The two have been experimentally measured out to 12 digits or so and found to be the same in magnitude, just opposite in sign. However, there is no theory of why this is -- the values are simply measured, and that is it.
It beggars belief that these just happen to be exactly (as far as we can measure) the same magnitude. There is almost certainly a lower-level mechanism that explains why they are exactly the same but opposite.
As a CERN alumnus: this isn't easy. The data is endless, processing it takes time, nearly everything is new technology, and it all needs to be validated before being put into use.
Thousands of people, across every branch of engineering, worked for a few decades to bring the LHC up before the Higgs came to be.
This stuff is hard, and there is no roadmap on how to get there.
It's hard. Particle physics faces the problem that in order to dig down to ever smaller scales, ironically, ever larger experiments are needed. We've pretty much built large enough colliders for our current understanding. No one really knows how much more energy would be needed to expose something new - it might be incremental, within current technical reach, or it might be many orders of magnitude beyond our current capabilities. The experiments have become expensive enough that there isn't a lot of appetite to build giant new systems without some really good reason. The hard part is coming up with a theory to justify the outlay, if you can't generate compelling data from existing systems.
Physics advances have been generally driven by observation, obtained through better and better instrumentation. We might be entering a long period of technology development, waiting for the moment our measurements can access (either through greater energy or precision) some new physics.
All of science is getting harder as the easiest discoveries are all pretty much behind us.
LLMs were a breakthrough I didn't expect and it's likely the last one we'll see in our lifetime.
It is almost always the case that when progress stalls for some meaningful period of time, a parochial taboo needs violating to move forwards.
The best known example is the pre- and post-Copernican conceptions of our relationship to the sun. But long before and ever since: if you show me physics with its wheels slipping in mud, I'll show you a culture not yet ready for a new frame.
We are so very attached to the notions of a unique and continuous identity observed by a physically real consciousness observing an unambiguous arrow of time.
Causality. That's what you give up next.
I am sure others will say it better, but the cat-in-the-box experiment is a shockingly bad metaphor for the idea behind quantum states and the observer effect.
I will commit the first sin by declaring, without fear of contradiction, that the cat actually IS either alive or dead; it is not in a superposition of states. What is incomplete is our knowledge of the state, and what collapses is that uncertainty.
If you shift this from the cat to the particle, what changes? Because if very much changes, my first comment about the unsuitability of the metaphor is upheld, and if very little changes, my comment has been disproven.
It should be clear that I am neither a physicist nor a logician.
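To put the distinction I'm groping at in the standard notation (copied more than understood, so treat it as a sketch): ignorance about the cat is a classical mixture, while the particle is assigned a genuine superposition,

    \rho_{\text{mixed}} = p\,|\text{alive}\rangle\langle\text{alive}| + (1-p)\,|\text{dead}\rangle\langle\text{dead}|,
    \qquad
    |\psi\rangle = \alpha\,|\text{alive}\rangle + \beta\,|\text{dead}\rangle,\quad
    \rho_{\text{pure}} = |\psi\rangle\langle\psi|.

The off-diagonal terms of \rho_{\text{pure}} are what produce interference, and for a single particle they are measurably there; for a cat they decohere away almost instantly, which is roughly why the metaphor strains.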
I never liked that the physics community shifted from 'high energy' particle physics (the topic of the article) to referring to this branch as just 'particle physics', which I think leaves the impression that anything to do with 'particles' is now a dead end.
Nuclear physics (i.e., low/medium-energy physics) covers diverse topics, many with real-world applications, yet involves a lot of the same particles (quarks, gluons). Because it is so diverse, it is not dead/dying in the way HEP is today.
Maybe a dumb question here, but how would they discover a dark matter particle, if dark matter is basically invisible to us except for its gravitational effects?
The use of "AI" in particle physics is not new. In 1999 they were using neural nets to compute various results. Here's one from Measurement of the top quark pair production cross section in p¯p collisions using multijet final states [https://repository.ias.ac.in/36977/1/36977.pdf]
"The analysis has been optimized using neural networks to achieve the smallest expected fractional uncertainty on the t¯t production cross section"
None of the comments seem to mention that it's also really really really really really expensive.
It's impossible to tell without opening the box that particle physics is in.
I find the arguments from those who say there is no crisis convincing. Progress doesn't happen at a constant rate. We made incredible, unprecedented progress in the 20th century. The most likely scenario is that it slows down for a while. Perhaps for hundreds of years again! Nobody can know. We are still making enormous strides compared to most of scientific history.
It's probably just very hard, in my opinion as a physicist.
One interesting gap in the standard model is why neutrinos have mass: https://cerncourier.com/a/the-neutrino-mass-puzzle/
It's just hard. I mean... it could very well be that there are many deeper layers underneath what we know in particle physics, but that, from our scale, it is so infeasible to build something to analyze and decompose the nuanced behavior happening at that level that it's practically impossible to do so. Just like it is impossible to split an atom with your bare hands...
This feels less like a story about particle physics "failing" and more like a story about a field running out of easy leverage.
Isn't it the mathematics that is lagging? Amplituhedron? Higher dimensional models?
Fun fact: I got to read the thesis of one of my uncles, who was a young professor back in the '90s, right when they were discovering bosons. They were already modelling them as tensors back then, and probably as multilinear transformations.
Now that I am grown I can understand a little more; I was about 10 years old back then. I had no idea he was studying and teaching the state of the art. xD
To my uneducated eye it looks like they have been stuck in limbo for 120 years. Nothing practical has been created based on those theories. It is just words and calculations spinning in circles.
I wish those people would focus on practical, real-world physics, so we could all enjoy new innovations.
When the model appears to have massive problems, maybe it's time to go back and revise it.
Is it more that even the most dedicated and passionate researchers have to frame their interests in a way that will get funding? Particle physics is not the thing those with the cash will fund right now; AI and QC are the focus.
Maybe this is all we can learn from home and we need to get out more.
Theoretical physics progresses via the anomalies it can't explain.
The problem is that we've mostly explained everything we have easy access to. We simply don't have that many anomalies left. Theoretical physicists were both happy and disappointed that the LHC simply verified everything--theories were correct, but there weren't really any pointers to where to go next.
Quantum gravity seems to be the big one, but that is not something we can penetrate easily. LIGO only recently came online, and can only really detect enormous events (like black hole mergers).
And while we don't always understand what things do as we scale up or in the aggregate, that doesn't require new physics to explain.
It is obviously not dead, but it should be dead: almost all of the technical and economic progress made in the last century was achieved with macroscopic quantum effects. Particle physics spends a lot of energy and material resources to measure microscopic effects. The priorities are essentially inverted. At this point it is not even about discovery; experiments are relegated to precision measurements. What practical use will it be if we know the mass/charge distribution/polarizability of some particles a few percent more precisely? About none.
Information content of the article:
The discovery of the Higgs boson in 2012 completed the Standard Model of particle physics, but the field has since faced a "crisis" due to the lack of new discoveries. The Large Hadron Collider (LHC) has not found any particles or forces beyond the Standard Model, defying theoretical expectations that additional particles would appear to solve the "hierarchy problem"—the unnatural gap between the Higgs mass and the Planck scale. This absence of new physics challenged the "naturalness" argument that had long guided the field.
In 2012, physicist Adam Falkowski predicted the field would undergo a slow decay without new discoveries. Reviewing the state of the field in 2026, he maintains that experimental particle physics is indeed dying, citing a "brain drain" where talented postdocs are leaving the field for jobs in AI and data science. However, the LHC remains operational and is expected to run for at least another decade.
Artificial intelligence is now being integrated into the field to improve data handling. AI pattern recognizers are classifying collision debris more accurately than human-written algorithms, allowing for more precise measurements of "scattering amplitude" or interaction probabilities. Some physicists, like Matt Strassler, argue that new physics might not lie at higher energies but could be hidden in "unexplored territory" at lower energies, such as unstable dark matter particles that decay into muon-antimuon pairs.
CERN physicists have proposed a Future Circular Collider (FCC), a 91-kilometer tunnel that would triple the circumference of the LHC. The plan involves first colliding electrons and positrons to measure scattering amplitudes precisely, followed by proton collisions at energies roughly seven times higher than the LHC's later in the century. Formal approval and funding for this project are not expected before 2028.
Meanwhile, U.S. physicists are pursuing a muon collider. Muons are elementary particles like electrons but are 200 times heavier, allowing for high-energy, clean collisions. The challenge is that muons are highly unstable and decay in microseconds, requiring rapid acceleration. A June 2025 national report endorsed the program, which is estimated to take about 30 years to develop and cost between $10 and $20 billion.
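A rough back-of-the-envelope shows why the microsecond lifetime is the whole game; the 5 TeV beam energy here is just an assumed working point, not a number from the report.

    # Muon survival at collider energy (textbook constants, assumed beam energy).
    tau0 = 2.197e-6        # muon lifetime at rest [s]
    m_mu = 0.10566         # muon mass [GeV]
    c = 3.0e8              # speed of light [m/s]

    E_beam = 5000.0        # assumed beam energy [GeV]
    gamma = E_beam / m_mu  # Lorentz factor, roughly 4.7e4

    tau_lab = gamma * tau0      # time-dilated lifetime in the lab frame [s]
    decay_length = tau_lab * c  # mean distance travelled before decaying [m]

    print(f"lab-frame lifetime: {tau_lab * 1e3:.0f} ms")
    print(f"mean decay length:  {decay_length / 1e3:.0f} km")

Once the muons are at full energy they live long enough (roughly 0.1 s, tens of thousands of kilometers of path) to circulate and collide; the hard part is getting them to that energy within the couple of microseconds they have at rest, which is what drives the rapid-acceleration requirement.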
China has reportedly moved away from plans to build a massive supercollider. Instead, they are favoring a cheaper experiment costing hundreds of millions of dollars—a "super-tau-charm facility"—designed to produce tau particles and charm quarks at lower energies.
On the theoretical side, some researchers have shifted to "amplitudeology," the abstract mathematical study of scattering amplitudes, in hopes of reformulating particle physics equations to connect with quantum gravity. Additionally, Jared Kaplan, a former physicist and co-founder of the AI company Anthropic, suggests that AI progress is outpacing scientific experimentation, positing that future colliders or theoretical breakthroughs might eventually be designed or discovered by AI rather than humans.
It's kind of legitimate, but it's kind of sad to see some of the smartest people in society just being like "maybe AI will just give me the answer," a phrase that has a lot of potential to be thought-terminating.
Curious what everyone thinks about this physicist's idea:
- the universe as a neural network (yes, yes, moving the universe-model paradigm along from clockwork to machine to computer to neural network)
I found it speculative but also fascinating.
See video here:
https://youtu.be/73IdQGgfxas?si=PKyTP8ElWNr87prG
AI summary of the video:
This video discusses Professor Vitaly Vanchurin's theory that the universe is literally a neural network, where learning dynamics are the fundamental physics (0:24). This concept goes beyond simply using neural networks to model physical phenomena; instead, it posits that the universe's own learning process gives rise to physical laws (0:46).
Key takeaways from the discussion include:
• The Universe as a Neural Network (0:00-0:57): Vanchurin emphasizes that he is proposing this as a promising model for describing the universe, rather than a definitive statement of its ontological nature (2:48). The core idea is that the learning dynamics, which are typically used to optimize functions in machine learning, are the fundamental physics of the cosmos (6:20).
• Deriving Fundamental Field Equations (21:17-22:01): The theory suggests that well-known physics equations, such as Einstein's field equations, Dirac, and Klein-Gordon equations, emerge from the learning process of this neural network universe.
• Fermions and Particle Emergence (28:47-32:15): The conversation delves into how particles like fermions could emerge within this framework, with the idea that useful network configurations for learning survive, similar to natural selection.
• Emergent Quantum Mechanics (44:53-49:31): The video explores how quantum behaviors, including the Schrödinger equation, could emerge from the two distinct dynamics within the system: activation and learning. This requires the system to have access to a "bath" or "reservoir" of neurons.
• Natural Selection at the Subatomic Scale (1:05:10-1:07:34): Vanchurin suggests that natural selection operates on subatomic particles, where configurations that are more useful for minimizing the loss function (i.e., for efficient learning) survive and those that are not are removed.
• Consciousness and Observers (1:15:40-1:24:09): The theory integrates the concept of observers into physics, proposing a three-way unification of quantum mechanics, general relativity, and observers. Consciousness is viewed as a measure of learning efficiency within a subsystem (1:30:38).
Why are we even trying to look deeper? To fit our mathematical curves better? Abstract spacetime, fields, virtual particles, wave function collapse, quantized energy, wave particle duality, etc. This is all BS. And I'm not disputing the theories or the experimental results. These concepts are unintelligible. They are self contradictory. They are not even abstractions, they are mutually exclusive paradigms forced together into a bewilderment. I'm not disputing that the math fits the observations. But these are not explanations. If this is what it's come to, all we can expect from here on is to better fit the math to the observation. And in the end, an equation that tells us nothing about what we really wanted to know, like "what is it really"? Nobody is going to be satisfied with an equation, so why are we still funding this enterprise, for better lasers to kill bad guys?
Maybe it's time for physicists to switch to agile? Don't try to solve the theory of the Universe at once; that's the waterfall model. Try to come up with just a single new equation each sprint!
Experimental particle physicist here. It's just hard.
I measured the electron's vector coupling to the Z boson at SLAC in the late 1990s, and the answer from that measurement is: we don't know yet - and that's the point.
Thirty years later, the discrepancy between my experiment and LEP's hasn't been resolved.
It might be nothing. It might be the first whisper of dark matter or a new force. And the only way to find out is to build the next machine. That's not 'dead', that's science being hard.
My measurement is a thread that's been dangling for decades, waiting to be pulled.
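For anyone curious what "measuring the electron's vector coupling" amounts to, schematically: with a polarized beam you measure the left-right asymmetry of Z production, which pins down the ratio of the vector to axial couplings and hence the effective weak mixing angle,

    A_{LR} = \frac{\sigma_L - \sigma_R}{\sigma_L + \sigma_R} = A_e,
    \qquad
    A_e = \frac{2\, g_V^e\, g_A^e}{(g_V^e)^2 + (g_A^e)^2},
    \qquad
    \frac{g_V^e}{g_A^e} = 1 - 4\sin^2\theta^{\mathrm{lept}}_{\mathrm{eff}}.

LEP extracted the same effective mixing angle from its own asymmetries, and the two central values still disagree.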