Millennium Run
The Millennium Run, or Millennium Simulation (the name referring to its size[1][2]), is a computer N-body simulation used to investigate how the distribution of matter in the Universe has evolved over time, and in particular how the observed population of galaxies formed. It is used by scientists working in physical cosmology to compare observations with theoretical predictions.
Overview
A basic scientific method for testing theories in cosmology is to evaluate their consequences for the observable parts of the universe. One piece of observational evidence is the distribution of matter, including galaxies and intergalactic gas, observed today. Because light emitted by more distant matter must travel longer to reach Earth, looking at distant objects is equivalent to looking further back in time. This means that the evolution of the matter distribution in the universe over cosmic time can also be observed directly.
The Millennium Simulation was run in 2005 by the Virgo Consortium, an international group of astrophysicists from Germany, the United Kingdom, Canada, Japan and the United States. It starts at the epoch when the cosmic background radiation was emitted, about 379,000 years after the universe began. The cosmic background radiation has been studied by satellite experiments, and the observed inhomogeneities in it serve as the starting point for following the evolution of the corresponding matter distribution. Using the physical laws expected to hold in the currently known cosmologies, together with simplified representations of the astrophysical processes observed to affect real galaxies, the initial distribution of matter is allowed to evolve, and the simulation's predictions for the formation of galaxies and black holes are recorded.
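The core numerical task is to integrate the gravitational motion of a very large number of mass elements forward in time. The toy Python sketch below illustrates the idea with a direct-summation leapfrog integrator; it is only a schematic illustration (the particle count, units, softening and time step are arbitrary choices made here), not the TreePM algorithm used by the actual GADGET code.

```python
import numpy as np

# Toy direct-summation N-body integrator (illustrative only; the Millennium Run
# used the far more efficient TreePM scheme in GADGET). Units are arbitrary and
# `softening` avoids the singularity when two particles pass very close.

def accelerations(pos, mass, G=1.0, softening=0.1):
    """Pairwise softened gravitational accelerations for N particles."""
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]   # (N, N, 3): r_j - r_i
    dist2 = np.sum(diff**2, axis=-1) + softening**2        # softened squared distance
    inv_r3 = dist2**-1.5
    np.fill_diagonal(inv_r3, 0.0)                          # no self-force
    return G * np.sum(diff * (mass[np.newaxis, :, None] * inv_r3[:, :, None]), axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """Advance positions and velocities by one kick-drift-kick step."""
    vel_half = vel + 0.5 * dt * accelerations(pos, mass)
    pos_new = pos + dt * vel_half
    vel_new = vel_half + 0.5 * dt * accelerations(pos_new, mass)
    return pos_new, vel_new

# Example: evolve 1,000 random particles for a few steps.
rng = np.random.default_rng(0)
pos = rng.uniform(-1.0, 1.0, size=(1000, 3))
vel = np.zeros((1000, 3))
mass = np.ones(1000)
for _ in range(10):
    pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)
```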
Since the completion of the Millennium Run simulation in 2005, a series of ever more sophisticated and higher-fidelity simulations of the formation of the galaxy population have been constructed within its stored output and made publicly available over the internet. In addition to improving the treatment of the astrophysics of galaxy formation, recent versions have adjusted the parameters of the underlying cosmological model to reflect changing ideas about their precise values. As of the end of 2016, more than 850 published papers had made use of data from the Millennium Run, making it, at least by this measure, the highest-impact astrophysical simulation of all time.[3]
Size of the simulation
For the first scientific results, published on June 2, 2005, the Millennium Simulation traced 2160³, or just over 10 billion, "particles". These are not particles in the particle-physics sense; each "particle" represents approximately a billion solar masses of dark matter.[1] The region of space simulated was a cube about 2 billion light years on a side.[1] This volume was populated by about 20 million "galaxies". The simulation, which used a version of the GADGET code, ran for more than a month on a supercomputer in Garching, Germany. Its output required about 25 terabytes of storage.[4]
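As a quick consistency check of these figures (a rough back-of-the-envelope calculation using only the approximate values quoted above):

```python
# Back-of-the-envelope check of the quoted Millennium Run numbers.
n_per_side = 2160
n_particles = n_per_side ** 3            # 10,077,696,000 -> "just over 10 billion"
particle_mass_msun = 1e9                 # "approximately a billion" solar masses each
total_mass_msun = n_particles * particle_mass_msun
print(f"{n_particles:,} particles, ~{total_mass_msun:.0e} solar masses of dark matter in total")
```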
First results
The Sloan Digital Sky Survey had challenged the prevailing understanding of cosmology by finding candidate supermassive black holes in very bright quasars at large distances, implying that these black holes formed much earlier than initially expected. By successfully producing quasars at such early times, the Millennium Simulation demonstrated that these objects do not contradict our models of the evolution of the universe.
Millennium II
In 2009, the same group ran the 'Millennium II' simulation (MS-II) on a smaller cube (about 400 million light years on a side), with the same number of particles but with each particle representing 6.9 million solar masses. This is a more demanding numerical task, since splitting the computational domain evenly between processors becomes difficult when dense clumps of matter are present. MS-II used 1.4 million CPU hours over 2048 cores (i.e. about a month of wall-clock time) on the Power-6 computer at Garching. A simulation was also run with the same initial conditions and fewer particles, to check that features seen in the higher-resolution run were also present at lower resolution.
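The "about a month" figure follows directly from the quoted CPU budget, assuming the 2048 cores ran continuously for the whole job (an assumption made here only for the arithmetic):

```python
# Wall-clock time implied by the quoted MS-II compute budget,
# assuming all 2048 cores were busy for the entire run.
cpu_hours = 1.4e6
cores = 2048
wall_clock_days = cpu_hours / cores / 24
print(f"~{wall_clock_days:.0f} days")   # ~28 days, i.e. roughly a month
```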
Millennium XXL
In 2010, the 'Millennium XXL' simulation (MXXL) was performed, this time using a much larger cube (over 13 billion light years on a side) and 6720³ particles, each representing 7 billion times the mass of the Sun. The MXXL spans a cosmological volume 216 and 27,000 times larger than the Millennium and MS-II simulation boxes, respectively. The simulation was run on JUROPA, one of the top 15 supercomputers in the world in 2010. It used more than 12,000 cores, the equivalent of 300 years of CPU time, and 30 terabytes of RAM, and generated more than 100 terabytes of data.[5] Cosmologists use the MXXL simulation to study the distribution of galaxies and dark matter halos on very large scales, and how the rarest and most massive structures in the universe came about.
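The quoted volume factors are simply the cubes of the ratios of the box side lengths: they correspond to the MXXL box being 6 times the Millennium box and 30 times the MS-II box on a side, consistent with the rounded light-year figures given above.

```python
# The volume factors quoted above are the cubes of the box side-length ratios
# (MXXL is ~6x the Millennium box and ~30x the MS-II box on a side).
print(6 ** 3)    # 216
print(30 ** 3)   # 27000
```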
Millennium Run Observatory
In 2012, the Millennium Run Observatory (MRObs) project was launched. The MRObs is a theoretical virtual observatory that combines detailed predictions for the dark matter (from the Millennium simulations) and for the galaxies (from semi-analytical models) with a virtual telescope to synthesize artificial observations. Astrophysicists use these virtual observations to study how the predictions of the Millennium simulations compare with the real universe, to plan future observational surveys, and to calibrate the techniques used by astronomers to analyze real observations. A first set of virtual observations produced by the MRObs has been released to the astronomical community for analysis through the MRObs web portal. The virtual universe can also be accessed through a new online tool, the MRObs browser, which lets users interact with the Millennium Run Relational Database, where the properties of millions of dark matter halos and their galaxies from the Millennium project are stored. Upgrades to the MRObs framework, and its extension to other types of simulations, are currently being planned.
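The relational database mentioned above is queried with SQL over a web interface. The Python sketch below shows only the general shape of such a query; the endpoint URL, parameter names, table name, and column names are placeholders invented here for illustration, and the real schema and access details are documented on the Millennium database pages linked under External links.

```python
import requests  # third-party HTTP client library

# Illustrative sketch of querying a Millennium-style SQL web service.
# NOTE: the URL, parameter names, table name, and column names below are
# placeholders, not the real schema; see the Millennium Run Relational
# Database documentation for actual table names and access requirements.
SQL = """
SELECT TOP 10 haloId, np, x, y, z
FROM SomeHaloCatalogue            -- placeholder table name
WHERE np > 1000                   -- haloes with more than 1000 particles
ORDER BY np DESC
"""

response = requests.get(
    "https://example.org/millennium/query",    # placeholder endpoint
    params={"action": "doQuery", "SQL": SQL},  # placeholder parameter names
    timeout=60,
)
response.raise_for_status()
print(response.text)  # results are typically returned as plain text / CSV
```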
References
1. Springel, Volker; et al. (2005). "Simulations of the formation, evolution, and clustering of galaxies and quasars". Nature. 435 (7042): 629–636. Bibcode:2005Natur.435..629S. PMID 15931216. arXiv:astro-ph/0504097. doi:10.1038/nature03597.
2. "MPA :: Current Research Highlight :: August 2004". Retrieved 2009-05-28.
3. "The Millennium Simulation public page". Retrieved 2017-02-15.
4. "Millennium Simulation - The Largest Ever Model of the Universe". Retrieved 2009-05-28.
5. "The Millennium-XXL Project: Simulating the Galaxy Population in Dark Energy Universes". Retrieved 2013-07-02.
Further reading
- Springel, Volker; et al. (2005). "Simulations of the formation, evolution, and clustering of galaxies and quasars". Nature. 435 (7042): 629–636. Bibcode:2005Natur.435..629S. PMID 15931216. arXiv:astro-ph/0504097. doi:10.1038/nature03597.
- Boylan-Kolchin, Michael; et al. (2009). "Resolving Cosmic Structure Formation with the Millennium-II Simulation". Monthly Notices of the Royal Astronomical Society. 398 (3): 1150–1164. Bibcode:2009MNRAS.398.1150B. arXiv:0903.3041. doi:10.1111/j.1365-2966.2009.15191.x.
- Angulo, Raul; et al. (2012). "Scaling relations for galaxy clusters in the Millennium-XXL simulation". Monthly Notices of the Royal Astronomical Society. 426: 2046. Bibcode:2012MNRAS.426.2046A. arXiv:1203.3216. doi:10.1111/j.1365-2966.2012.21830.x.
- Overzier, Roderik; et al. (2012). "The Millennium Run Observatory: First Light". Monthly Notices of the Royal Astronomical Society. 428: 778. Bibcode:2013MNRAS.428..778O. arXiv:1206.6923. doi:10.1093/mnras/sts076.
- Lemson, Gerard; Virgo Consortium (2006). "Halo and Galaxy Formation Histories from the Millennium Simulation: Public release of a VO-oriented and SQL-queryable database for studying the evolution of galaxies in the LambdaCDM cosmogony". Bibcode:2006astro.ph..8019L. arXiv:astro-ph/0608019.
External links
- Millennium Simulation Data Page
- Press release of the June 2 results (MPG)
- VIRGO home page
- Simulating the joint evolution of quasars, galaxies and their large-scale distribution
- The Millennium Run Observatory Page
- The Millennium Run Relational Database