
Equivariant Symmetry Breaking Sets

Astrophysics

Authors

YuQing Xie, Tess Smidt

Abstract

Equivariant neural networks (ENNs) have been shown to be extremely effective in applications involving underlying symmetries. By construction ENNs cannot produce lower symmetry outputs given a higher symmetry input. However, symmetry breaking occurs in many physical systems and we may obtain a less symmetric stable state from an initial highly symmetric one. Hence, it is imperative that we understand how to systematically break symmetry in ENNs. In this work, we propose a novel symmetry breaking framework that is fully equivariant and is the first which fully addresses spontaneous symmetry breaking. We emphasize that our approach is general and applicable to equivariance under any group. To achieve this, we introduce the idea of symmetry breaking sets (SBS). Rather than redesign existing networks, we design sets of symmetry breaking objects which we feed into our network based on the symmetry of our inputs and outputs. We show there is a natural way to define equivariance on these sets, which gives an additional constraint. Minimizing the size of these sets equates to data efficiency. We prove that minimizing these sets translates to a well studied group theory problem, and tabulate solutions to this problem for the point groups. Finally, we provide some examples of symmetry breaking to demonstrate how our approach works in practice. The code for these examples is available at \url{https://github.com/atomicarchitects/equivariant-SBS}.

Concepts

equivariant neural networks, symmetry breaking, symmetry breaking sets, group theory, normalizer constraint, geometric deep learning, point group symmetry, symmetry preservation, phase transitions, crystal structure

The Big Picture

Imagine a perfectly symmetric snowflake melting and refreezing into a lopsided crystal. The laws governing water molecules are completely symmetric; they don’t prefer any direction. Yet nature routinely produces structures that break that symmetry. This is one of the deep puzzles in physics, and it turns out to be a serious problem for some of the most powerful AI tools scientists use today.

A special class of AI called equivariant neural networks (ENNs) has become a staple in computational physics and chemistry. They’re built with a clever guarantee: rotate your molecule, and the network’s predictions rotate with it. Feed in a crystal structure, and the output respects the same spatial symmetries. This property, called equivariance, means the network needs far less training data because it never has to rediscover the same physics twice from different orientations.

The catch? That guarantee becomes a straitjacket. By construction, an ENN cannot predict an output with lower symmetry than its input. You can’t get a lopsided crystal from a perfectly symmetric starting configuration — not because the physics forbids it, but because the math of the network does.
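Both the guarantee and the straitjacket can be seen in a few lines of numpy. The toy map below is an illustration of the general principle, not the paper's architecture: it is rotation-equivariant by construction, and as a direct consequence a four-fold-symmetric input can only ever produce a four-fold-symmetric output.

```python
import numpy as np

# Toy rotation-equivariant map: scale each 2D point by its norm.
# Norms are rotation-invariant, so f commutes with any rotation.
def f(points):
    return points * np.linalg.norm(points, axis=1, keepdims=True)

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

R = rot(np.pi / 2)  # 90-degree rotation

# Equivariance: rotating the input rotates the output, f(xR) == f(x)R.
x = np.array([[1.0, 2.0], [-0.5, 0.3]])
assert np.allclose(f(x @ R.T), f(x) @ R.T)

# A C4-symmetric input: rotating this square by 90 degrees merely permutes its points.
square = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
out = f(square)

# Equivariance forces the output, viewed as a set, to keep the same C4 symmetry --
# a symmetric input can never yield a "lopsided" output.
assert np.allclose(np.sort(out @ R.T, axis=0), np.sort(out, axis=0))
```

Rotating `square` permutes its rows, so equivariance pins the output set in place; this is exactly the constraint the paper sets out to loosen.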

MIT researchers YuQing Xie and Tess Smidt have cracked this problem open. In a paper published in Transactions on Machine Learning Research, they introduce symmetry breaking sets (SBS), a mathematically rigorous, fully equivariant framework for teaching neural networks to break symmetry without giving up the properties that make ENNs work.

Key Insight: You don’t need to redesign your neural network to handle symmetry breaking. Instead, design a carefully chosen set of “symmetry breaking objects” as additional inputs, and the math of group theory tells you exactly how to pick them.

How It Works

The paper draws a sharp distinction between two flavors of symmetry breaking. Explicit symmetry breaking happens when the underlying laws themselves are asymmetric. Spontaneous symmetry breaking is subtler and more physically fundamental: the laws are perfectly symmetric, but the system settles into one of several equally valid low-symmetry states.

Think of a ball balanced atop a Mexican-hat-shaped hill. The hill is perfectly symmetric around the peak, but the ball must fall somewhere, rolling into one of infinitely many equivalent positions around the rim. This is the regime that broke earlier approaches.
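The Mexican-hat picture is easy to make concrete. The sketch below uses the standard potential V(x) = (|x|² − 1)², a textbook illustration rather than one of the paper's examples: the energy function is perfectly rotation-symmetric, yet every individual minimum breaks that symmetry.

```python
import numpy as np

# Mexican-hat potential: V(x) = (|x|^2 - 1)^2, minimized on the unit circle.
def V(x):
    return (np.dot(x, x) - 1.0) ** 2

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

R = rot(0.7)  # an arbitrary rotation

# The law is symmetric: V(Rx) == V(x) for any point x.
x = np.array([0.3, -1.2])
assert np.isclose(V(R @ x), V(x))

# But each minimum (any unit vector) breaks the symmetry:
minimum = np.array([1.0, 0.0])               # one point on the rim, V = 0
assert np.isclose(V(minimum), 0.0)
assert not np.allclose(R @ minimum, minimum)  # the minimum itself is not rotation-invariant
assert np.isclose(V(R @ minimum), 0.0)        # ...though its rotations are equally valid minima
```

A symmetric law with an asymmetric ground state is precisely the configuration an unaided equivariant network cannot reproduce.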

Figure 1

Rather than tinkering with the internals of an existing equivariant network, Xie and Smidt propose feeding the network additional “symmetry breaking objects” alongside the original input. These objects carry just enough asymmetry to nudge the network toward one valid low-symmetry output. The trick is designing them so the whole input (original data plus symmetry breaker) still transforms correctly whenever the system is rotated, reflected, or reoriented.

Here’s what “correctly” means in practice:

  • The set must be closed under the normalizer. When you rotate or reorient the input, the symmetry-breaking objects must rearrange themselves in a matching way. The normalizer of a symmetry group H is the set of all transformations g that preserve the group's structure under conjugation (gHg⁻¹ = H), i.e., all the ways you can reorient data without changing its symmetry type.
  • Minimizing set size maximizes data efficiency. A smaller SBS means fewer symmetry-breaking copies of your data to process. The authors prove that finding a minimum-size equivariant SBS is mathematically equivalent to finding complements of subgroups: a complement of the preserved symmetry H is a subgroup that intersects H only in the identity and, together with H, generates the full group.
  • Counterintuitive bonus: Sometimes it’s more efficient to break more symmetry than strictly necessary. Breaking all symmetry of the input can yield a smaller, cleaner SBS than carefully preserving some symmetries.

Figure 2

The researchers tabulate concrete solutions for all point groups, the symmetry groups describing molecules and crystals. This is the kind of reference table computational physicists will actually use: pick your input symmetry, pick your desired output symmetry, look up the table, and you have your SBS recipe.

To demonstrate the framework, the paper walks through crystal distortions from high- to low-symmetry phases, ground states that break Hamiltonian symmetry, and fluid dynamics simulations where symmetric flows spontaneously develop asymmetric Kármán vortex streets. In each case, a standard equivariant network augmented with an SBS produces the full set of valid lower-symmetry outputs, something that was previously out of reach.

Why It Matters

Spontaneous symmetry breaking is not an edge case. The Higgs mechanism, which gives fundamental particles their mass, is a spontaneous symmetry breaking event. Phase transitions in materials, from ferromagnetism to superconductivity, involve it. So does the formation of large-scale structure in the early universe.

As machine learning becomes increasingly central to physics research, the inability to handle spontaneous symmetry breaking is a real scientific obstacle. The SBS framework removes it, and it’s plug-and-play: researchers don’t need to discard existing equivariant models or retrain from scratch. They simply augment their inputs using the tabulated SBS designs. The code is open-source.

That low barrier to adoption means the framework could spread quickly to molecular dynamics, crystal structure prediction, quantum chemistry, and PDE solvers (software for simulating fluid flow, heat transfer, and other physical processes), anywhere that equivariant networks are already deployed and symmetry breaking shows up.

Bottom Line: Xie and Smidt have solved the spontaneous symmetry breaking problem for equivariant neural networks, not by breaking the networks, but by giving them the right extra information. The result is a general, mathematically rigorous, and practically usable framework built on classical group theory.

IAIFI Research Highlights

Interdisciplinary Research Achievement
This work translates a classical mathematical concept (subgroup complements) into a concrete engineering recipe for AI systems that model physical symmetry breaking, bridging abstract group theory and practical machine learning for physics.
Impact on Artificial Intelligence
The symmetry breaking set framework is the first fully equivariant solution to spontaneous symmetry breaking in neural networks, letting ENNs model a class of physical phenomena that was previously beyond their reach while keeping their core mathematical guarantees intact.
Impact on Fundamental Interactions
Spontaneous symmetry breaking is the mechanism behind the Higgs field, phase transitions, and crystal distortions. By enabling AI models to handle it correctly, this work opens the door to deploying machine learning more broadly in quantum chemistry, condensed matter physics, and particle physics.
Outlook and References
Future work includes extending the framework to continuous groups and infinite-dimensional symmetries; the full tabulation of point group solutions is available in the paper (*Transactions on Machine Learning Research*, October 2024; [arXiv:2402.02681](https://arxiv.org/abs/2402.02681); code at github.com/atomicarchitects/equivariant-SBS).