I measured the electron's vector coupling to the Z boson at SLAC in the late 1990s, and the answer from that measurement is: we don't know yet - and that's the point.
Thirty years later, the discrepancy between my experiment and LEP's hasn't been resolved.
It might be nothing. It might be the first whisper of dark matter or a new force. And the only way to find out is to build the next machine. That's not 'dead', that's science being hard.
My measurement is a thread that's been dangling for decades, waiting to be pulled.
Back then, we thought our theory was more or less complete while having experimental data that disproved it (the Michelson-Morley experiment, Mercury's perihelion precession; I am sure there are others).
Right now, we know our theories are incomplete (since GR and QFT are incompatible) while having no experimental data which contradicts them.
The charge of the electron is -1 and that of the proton +1. The magnitudes have been experimentally measured to be the same out to 12 digits or so, just opposite in sign. However, there is no theory for why this is -- the values are simply measured, and that is it.
It beggars belief that these just happen to be exactly (as far as we can measure) the same magnitude. There almost certainly is a lower-level mechanism that explains why they are exactly the same but opposite.
Thousands of people across all engineering branches worked for a few decades on bringing the LHC up before the Higgs came to be.
This stuff is hard, and there is no roadmap on how to get there.
Physics advances have been generally driven by observation, obtained through better and better instrumentation. We might be entering a long period of technology development, waiting for the moment our measurements can access (either through greater energy or precision) some new physics.
LLMs were a breakthrough I didn't expect, and likely the last one we'll see in our lifetime.
The best-known example is the shift between the pre- and post-Copernican conceptions of our relationship to the sun. But long before and ever since: if you show me physics with its wheels slipping in the mud, I'll show you a culture not yet ready for a new frame.
We are so very attached to the notions of a unique and continuous identity observed by a physically real consciousness observing an unambiguous arrow of time.
Causality. That's what you give up next.
I will commit the first sin by declaring, without fear of contradiction, that the cat actually IS either alive or dead. It is not in a superposition of states. What is incomplete is our knowledge of the state, and what collapses is that uncertainty.
If you shift this to the particle, not the cat, what changes? Because if very much changes, my first comment about the unsuitability of the metaphor is upheld, and if very little changes, my comment has been disproven.
It would be clear I am neither a physicist nor a logician.
Nuclear physics (i.e., low/medium-energy physics) covers diverse topics, many with real-world applications - yet it travels with a lot of the same particles (e.g., quarks, gluons). Because it is so diverse, it is not dead/dying in the way HEP is today.
"The analysis has been optimized using neural networks to achieve the smallest expected fractional uncertainty on the t¯t production cross section"
Fun fact: I got to read the thesis of one of my uncles, who was a young professor back in the '90s, right when they were discovering bosons. They were already modelling them as tensors back then, and probably as multilinear transformations.
Now that I am grown I can understand a little more; I was about 10 years old back then. I had no idea he was studying and teaching the state of the art. xD
I wish those people would focus on practical, real-world physics, so we can all enjoy new innovations.
The problem is that we've mostly explained everything we have easy access to. We simply don't have that many anomalies left. Theoretical physicists were both happy and disappointed that the LHC simply verified everything--theories were correct, but there weren't really any pointers to where to go next.
Quantum gravity seems to be the big one, but that is not something we can penetrate easily. LIGO came online only recently, and it can only really detect enormous events (like black hole mergers).
And while we don't always understand what things do as we scale up or in the aggregate, explaining that doesn't require new physics.
The discovery of the Higgs boson in 2012 completed the Standard Model of particle physics, but the field has since faced a "crisis" due to the lack of new discoveries. The Large Hadron Collider (LHC) has not found any particles or forces beyond the Standard Model, defying theoretical expectations that additional particles would appear to solve the "hierarchy problem"—the unnatural gap between the Higgs mass and the Planck scale. This absence of new physics challenged the "naturalness" argument that had long guided the field.
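For a sense of the scale gap the summary refers to (standard textbook numbers, not figures from the article):

```latex
m_H \approx 125\ \mathrm{GeV}, \qquad
M_{\mathrm{Pl}} = \sqrt{\hbar c / G_N} \approx 1.2 \times 10^{19}\ \mathrm{GeV},
\qquad \frac{m_H}{M_{\mathrm{Pl}}} \approx 10^{-17}
```

The "unnatural" part is that quantum corrections tend to pull the Higgs mass up toward the heaviest scale in the theory, so keeping it roughly seventeen orders of magnitude below the Planck scale looks like extreme fine-tuning unless new particles or symmetries intervene - which is what the LHC was widely expected to find.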
In 2012, physicist Adam Falkowski predicted the field would undergo a slow decay without new discoveries. Reviewing the state of the field in 2026, he maintains that experimental particle physics is indeed dying, citing a "brain drain" where talented postdocs are leaving the field for jobs in AI and data science. However, the LHC remains operational and is expected to run for at least another decade.
Artificial intelligence is now being integrated into the field to improve data handling. AI pattern recognizers are classifying collision debris more accurately than human-written algorithms, allowing for more precise measurements of "scattering amplitude" or interaction probabilities. Some physicists, like Matt Strassler, argue that new physics might not lie at higher energies but could be hidden in "unexplored territory" at lower energies, such as unstable dark matter particles that decay into muon-antimuon pairs.
CERN physicists have proposed a Future Circular Collider (FCC), a 91-kilometer tunnel that would more than triple the circumference of the LHC. The plan involves first colliding electrons and positrons to measure scattering amplitudes precisely, followed by proton collisions at energies roughly seven times higher than the LHC's later in the century. Formal approval and funding for this project are not expected before 2028.
Meanwhile, U.S. physicists are pursuing a muon collider. Muons are elementary particles like electrons but are 200 times heavier, allowing for high-energy, clean collisions. The challenge is that muons are highly unstable and decay in microseconds, requiring rapid acceleration. A June 2025 national report endorsed the program, which is estimated to take about 30 years to develop and cost between $10 and $20 billion.
China has reportedly moved away from plans to build a massive supercollider. Instead, they are favoring a cheaper experiment costing hundreds of millions of dollars—a "super-tau-charm facility"—designed to produce tau particles and charm quarks at lower energies.
On the theoretical side, some researchers have shifted to "amplitudeology," the abstract mathematical study of scattering amplitudes, in hopes of reformulating particle physics equations to connect with quantum gravity. Additionally, Jared Kaplan, a former physicist and co-founder of the AI company Anthropic, suggests that AI progress is outpacing scientific experimentation, positing that future colliders or theoretical breakthroughs might eventually be designed or discovered by AI rather than humans.
- the universe as a Neural Network (yes yes, moving the universe-model paradigm from the old clockwork, to machine, to computer, to neural network)
I found it speculative, but also fascinating.
See video here:
https://youtu.be/73IdQGgfxas?si=PKyTP8ElWNr87prG
AI summary of the video:
This video discusses Professor Vitaly Vanchurin's theory that the universe is literally a neural network, where learning dynamics are the fundamental physics (0:24). This concept goes beyond simply using neural networks to model physical phenomena; instead, it posits that the universe's own learning process gives rise to physical laws (0:46).
Key takeaways from the discussion include:
• The Universe as a Neural Network (0:00-0:57): Vanchurin emphasizes that he is proposing this as a promising model for describing the universe, rather than a definitive statement of its ontological nature (2:48). The core idea is that the learning dynamics, which are typically used to optimize functions in machine learning, are the fundamental physics of the cosmos (6:20).
• Deriving Fundamental Field Equations (21:17-22:01): The theory suggests that well-known physics equations, such as Einstein's field equations, Dirac, and Klein-Gordon equations, emerge from the learning process of this neural network universe.
• Fermions and Particle Emergence (28:47-32:15): The conversation delves into how particles like fermions could emerge within this framework, with the idea that useful network configurations for learning survive, similar to natural selection.
• Emergent Quantum Mechanics (44:53-49:31): The video explores how quantum behaviors, including the Schrödinger equation, could emerge from the two distinct dynamics within the system: activation and learning. This requires the system to have access to a "bath" or "reservoir" of neurons.
• Natural Selection at the Subatomic Scale (1:05:10-1:07:34): Vanchurin suggests that natural selection operates on subatomic particles, where configurations that are more useful for minimizing the loss function (i.e., for efficient learning) survive and those that are not are removed.
• Consciousness and Observers (1:15:40-1:24:09): The theory integrates the concept of observers into physics, proposing a three-way unification of quantum mechanics, general relativity, and observers. Consciousness is viewed as a measure of learning efficiency within a subsystem (1:30:38).