The standard model of the Universe depends on six key numbers. Using a new AI technique, researchers at the Flatiron Institute and their collaborators have analyzed galaxy distributions to estimate five of these numbers with exceptional precision.
The new AI method significantly improved on past results: compared with conventional analyses, it more than halved the uncertainty in measurements of the Universe’s matter clumpiness. Its estimates also agreed well with independent ones derived from other observations, such as the Universe’s oldest light, the cosmic microwave background.
Understanding the cosmological parameters better hinges on extracting tighter constraints on them from the same observational data.
The six cosmological parameters describe the amounts of ordinary matter, dark matter, and dark energy in the Universe, as well as conditions following the Big Bang, such as the opacity of the newborn Universe as it cooled and whether mass in the cosmos is spread out smoothly or concentrated in big clumps. Scientists liken these parameters to the ‘settings’ of the Universe, since they determine how it operates on the largest scales.
Studying the clustering of galaxies is one of the most important ways cosmologists calculate these parameters.
Previously, such analyses examined only the large-scale distribution of galaxies. Scientists had been unable to push down to smaller scales because they lacked a reliable way of extracting the information encoded there.
The new study used AI to extract that small-scale information with a framework called SimBIG (SIMulation-Based Inference of Galaxies). The approach had two phases. First, the researchers trained an AI model to infer the values of the cosmological parameters from the appearance of simulated universes. Then they showed the model actual observations of galaxy distributions.
The model was trained on 2,000 box-shaped universes from the CCA-developed Quijote simulation suite, each created using different values of the cosmological parameters.
To give the model realistic practice, the researchers processed these simulated universes to mimic real galaxy survey data, including imperfections introduced by the atmosphere and the telescopes themselves.
By analyzing the simulations, the model learned how the cosmological parameters relate to small-scale galaxy clustering, such as the separations between pairs of galaxies. It also learned to draw information from larger patterns by examining groups of three or more galaxies and the shapes they trace, such as stretched-out triangles or compact equilateral ones.
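To make those statistics concrete, here is a minimal sketch of the kind of clustering summaries involved: a histogram of pairwise galaxy separations (two-point information) and a rough classification of triangle shapes among galaxy triplets (three-point information). The box size, bin edges, sample sizes, and elongation threshold below are arbitrary choices for illustration, not the inputs used in the study.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)

# Illustrative stand-in for a galaxy catalog: random positions in a
# (100 Mpc/h)^3 box. A real analysis would use survey coordinates.
galaxies = rng.uniform(0.0, 100.0, size=(1000, 3))

# Two-point information: how often pairs sit at each separation.
separations = pdist(galaxies)
pair_counts, edges = np.histogram(separations, bins=10, range=(0.0, 100.0))
for lo, hi, n in zip(edges[:-1], edges[1:], pair_counts):
    print(f"{lo:5.1f}-{hi:5.1f} Mpc/h: {n} pairs")

# Three-point information: sample triplets and classify triangle shapes
# by the ratio of the longest side to the shortest.
idx = rng.integers(0, len(galaxies), size=(5000, 3))
tri = galaxies[idx]
sides = np.sort(np.stack([
    np.linalg.norm(tri[:, 0] - tri[:, 1], axis=1),
    np.linalg.norm(tri[:, 1] - tri[:, 2], axis=1),
    np.linalg.norm(tri[:, 0] - tri[:, 2], axis=1),
], axis=1), axis=1)
valid = sides[:, 0] > 0  # drop triplets that repeated a galaxy
elongated = np.mean(sides[valid, 2] > 2.5 * sides[valid, 0])
print(f"fraction of stretched-out triangles: {elongated:.2f}")
```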
The researchers then presented the trained model with measurements of 109,636 real galaxies from the Baryon Oscillation Spectroscopic Survey (BOSS). The model leveraged both small-scale and large-scale details in the data to boost the precision of its cosmological parameter estimates.
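The two-phase logic, training on simulations with known parameters and then applying the learned mapping to observed data, can be sketched end to end in toy form. Everything here (the one-parameter “universe,” the noise level, the single summary statistic, the linear fit) is a deliberately simplified assumption; SimBIG’s actual pipeline uses full galaxy catalogs and neural simulation-based inference.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "universe": a 1D density field whose clumpiness is set by a
# hidden amplitude parameter. Purely illustrative, not a cosmology code.
def simulate_universe(amplitude, n_cells=512):
    field = amplitude * rng.standard_normal(n_cells)
    return field + 0.1 * rng.standard_normal(n_cells)  # instrument noise

def summarize(field):
    return np.std(field)  # a crude one-number clustering summary

# Phase 1: train on simulations with known parameter values.
true_amplitudes = rng.uniform(0.5, 2.0, size=2000)
summaries = np.array([summarize(simulate_universe(a)) for a in true_amplitudes])
slope, intercept = np.polyfit(summaries, true_amplitudes, deg=1)

# Phase 2: apply the learned mapping to an "observed" universe.
observed = simulate_universe(amplitude=1.3)
estimate = slope * summarize(observed) + intercept
print(f"inferred amplitude: {estimate:.2f} (truth: 1.30)")
```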
The estimates were so precise that they matched what traditional methods would achieve with roughly four times as many galaxies. That matters because the Universe contains only a finite number of galaxies to observe. By wringing more precision out of less data, SimBIG can push the boundaries of what’s possible.
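A quick way to see why these two figures, half the uncertainty and four times the galaxies, line up: if (as is typical for such statistics) the uncertainty shrinks roughly as 1/√N with sample size N, then quadrupling the sample gives σ(4N) = σ(N)/√4 = σ(N)/2, i.e., half the error bar. The 1/√N scaling is a standard statistical assumption here, not a figure quoted from the paper.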
One exciting application of this precision is the Hubble tension, a long-standing cosmological discrepancy in which different methods yield conflicting estimates of the Hubble constant, the rate at which the Universe is expanding.
ChangHoon Hahn, an associate research scholar at Princeton University, said, “If we measure the quantities very precisely and can firmly say that there is a tension, that could reveal new physics about dark energy and the expansion of the Universe.”
Journal Reference:
- Hahn, C., Lemos, P., Parker, L. et al. Cosmological constraints from non-Gaussian and nonlinear galaxy clustering using the SimBIG inference framework. Nat Astron (2024). DOI: 10.1038/s41550-024-02344-2