Tagged: Dark Energy and Dark Matter

• richardmitnick 12:01 pm on August 5, 2019
    Tags: "Large cosmological simulation to run on Mira", Dark Energy and Dark Matter

    From Argonne Leadership Computing Facility: “Large cosmological simulation to run on Mira” 


    An extremely large cosmological simulation—among the five most extensive ever conducted—is set to run on Mira this fall and exemplifies the scope of problems addressed on the leadership-class supercomputer at the U.S. Department of Energy’s (DOE’s) Argonne National Laboratory.

Mira, the IBM Blue Gene/Q supercomputer at the Argonne Leadership Computing Facility

    Argonne physicist and computational scientist Katrin Heitmann leads the project. Heitmann was among the first to leverage Mira’s capabilities when, in 2013, the IBM Blue Gene/Q system went online at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility. Among the largest cosmological simulations ever performed at the time, the Outer Rim Simulation she and her colleagues carried out enabled further scientific research for many years.

    For the new effort, Heitmann has been allocated approximately 800 million core-hours to perform a simulation that reflects cutting-edge observational advances from satellites and telescopes and will form the basis for sky maps used by numerous surveys. Evolving a massive number of particles, the simulation is designed to help resolve mysteries of dark energy and dark matter.

    “By transforming this simulation into a synthetic sky that closely mimics observational data at different wavelengths, this work can enable a large number of science projects throughout the research community,” Heitmann said. “But it presents us with a big challenge.” That is, in order to generate synthetic skies across different wavelengths, the team must extract relevant information and perform analysis either on the fly or after the fact in post-processing. Post-processing requires the storage of massive amounts of data—so much, in fact, that merely reading the data becomes extremely computationally expensive.
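To put rough numbers on that challenge, here is a back-of-the-envelope sketch in Python. The 800 million core-hours and Mira's core count come from the article and the machine's public specifications; the particle count, bytes per particle, and snapshot schedule are illustrative assumptions (the article gives none of them), chosen to be Outer-Rim-scale.

```python
# Back-of-the-envelope: why post-processing a full-machine cosmology
# run is storage-bound. Only the first two inputs come from the article
# and Mira's public specs; the rest are illustrative assumptions.

CORE_HOURS = 800e6           # allocation quoted in the article
MIRA_CORES = 786_432         # Mira: 49,152 Blue Gene/Q nodes x 16 cores

full_machine_days = CORE_HOURS / MIRA_CORES / 24
print(f"~{full_machine_days:.0f} days of the full machine")   # ~42 days

N_PARTICLES = 1.1e12         # assumed: Outer-Rim-scale particle count
BYTES_PER_PARTICLE = 36      # assumed: positions + velocities + an id
N_SNAPSHOTS = 100            # assumed snapshot schedule

snapshot_tb = N_PARTICLES * BYTES_PER_PARTICLE / 1e12
total_pb = snapshot_tb * N_SNAPSHOTS / 1000
print(f"~{snapshot_tb:.0f} TB per snapshot, ~{total_pb:.0f} PB in total")
```

At petabyte scale, even a single read pass over the stored outputs is a major expense, which is the pressure behind the on-the-fly analysis tools described below.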

    Since Mira was launched, Heitmann and her team have implemented in their Hardware/Hybrid Accelerated Cosmology Code (HACC) more sophisticated analysis tools for on-the-fly processing. “Moreover, compared to the Outer Rim Simulation, we’ve effected three major improvements,” she said. “First, our cosmological model has been updated so that we can run a simulation with the best possible observational inputs. Second, as we’re aiming for a full-machine run, volume will be increased, leading to better statistics. Most importantly, we set up several new analysis routines that will allow us to generate synthetic skies for a wide range of surveys, in turn allowing us to study a wide range of science problems.”

The team’s simulation will address numerous fundamental questions in cosmology and is essential for refining existing predictive tools and aiding the development of new models, impacting both ongoing and upcoming cosmological surveys, including the Dark Energy Spectroscopic Instrument (DESI), the Large Synoptic Survey Telescope (LSST), SPHEREx, and the “Stage-4” ground-based cosmic microwave background experiment (CMB-S4).

    LBNL/DESI spectroscopic instrument on the Mayall 4-meter telescope at Kitt Peak National Observatory starting in 2018


    NOAO/Mayall 4 m telescope at Kitt Peak, Arizona, USA, Altitude 2,120 m (6,960 ft)


    LSST Camera, built at SLAC



LSST telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.


    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

Depiction of NASA’s SPHEREx, the Spectro-Photometer for the History of the Universe, Epoch of Reionization and Ices Explorer


    The value of the simulation derives from its tremendous volume (which is necessary to cover substantial portions of survey areas) and from attaining levels of mass and force resolution sufficient to capture the small structures that host faint galaxies.

The volume and resolution pose steep computational requirements, and because they are not easily met, few large-scale cosmological simulations are carried out. Contributing to the difficulty is the fact that supercomputer memory capacity has not grown in proportion to processing speed in the years since Mira’s introduction. That makes Mira, despite its relative age, well suited to a campaign of this scale when harnessed in full.

    “A calculation of this scale is just a glimpse at what the exascale resources in development now will be capable of in 2021/22,” said Katherine Riley, ALCF Director of Science. “The research community will be taking advantage of this work for a very long time.”

    Funding for the simulation is provided by DOE’s High Energy Physics program. Use of ALCF computing resources is supported by DOE’s Advanced Scientific Computing Research program.

See the full article here.


    Please help promote STEM in your local schools.

STEM Education Coalition

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About ALCF
    The Argonne Leadership Computing Facility’s (ALCF) mission is to accelerate major scientific discoveries and engineering breakthroughs for humanity by designing and providing world-leading computing facilities in partnership with the computational science community.

    We help researchers solve some of the world’s largest and most complex problems with our unique combination of supercomputing resources and expertise.

    ALCF projects cover many scientific disciplines, ranging from chemistry and biology to physics and materials science. Examples include modeling and simulation efforts to:

    Discover new materials for batteries
    Predict the impacts of global climate change
    Unravel the origins of the universe
    Develop renewable energy technologies

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

     
• richardmitnick 9:13 pm on April 17, 2018
    Tags: And yet - here we are, Dark Energy and Dark Matter, What Astronomers Wish Everyone Knew About Dark Matter And Dark Energy

    From Ethan Siegel: “What Astronomers Wish Everyone Knew About Dark Matter And Dark Energy” 

Apr 17, 2018

    One way of measuring the Universe’s expansion history involves going all the way back to the first light we can see, when the Universe was just 380,000 years old. The other ways don’t go backwards nearly as far, but also have a lesser potential to be contaminated by systematic errors. (European Southern Observatory).

Among the general public, people compare them to the aether, phlogiston, or epicycles. Yet almost all astronomers are certain: dark matter and dark energy exist. Here’s why.

If you go by what’s often reported in the news, you’d be under the impression that dark matter and dark energy are houses of cards just waiting to be blown down. Theorists are constantly exploring other options; individual galaxies and their satellites arguably favor some modification of gravity over dark matter; there are big controversies over just how fast the Universe is expanding, and the conclusions we’ve drawn from supernova data may need to be altered. Given that we’ve made mistaken assumptions in the past by presuming that the unseen Universe contained substances that simply weren’t there, from the aether to phlogiston, isn’t it a greater leap of faith to assume that 95% of the Universe is some invisible, unseen form of energy than it is to assume there’s just a flaw in the law of gravity?

    The answer is a resounding, absolute no, according to almost all astronomers, astrophysicists, and cosmologists who study the Universe. Here’s why.

Cosmology is the science of what the Universe is, how it came to be this way, what its fate is, and what it’s made of. Originally, these questions were in the realms of poets, philosophers and theologians, but the 20th century brought them firmly into the realm of science. When Einstein put forth his theory of General Relativity, one of the first things realized was that if you fill the space that makes up the Universe with any form of matter or energy, the Universe immediately becomes unstable: it can expand or contract, but all static solutions are unstable. Once we measured the Hubble expansion of the Universe and discovered the leftover glow from the Big Bang in the form of the Cosmic Microwave Background, cosmology became a quest to measure two numbers: the expansion rate itself and how that rate changed over time. Measure those, and General Relativity tells you everything you could want to know about the Universe.

Cosmic Microwave Background maps: NASA/COBE (1989-1993), NASA/WMAP (2001-2010), ESA/Planck (2009-2013)

A plot of the apparent expansion rate (y-axis) vs. distance (x-axis) is consistent with a Universe that expanded faster in the past, but is still expanding today. This is a modern version of Hubble’s original work, extending thousands of times farther. Note that the points do not form a straight line, indicating the expansion rate’s change over time. (Ned Wright, based on the latest data from Betoule et al. (2014))

    These two numbers, known as H_0 and q_0, are called the Hubble parameter and the deceleration parameter, respectively. If you take a Universe that’s filled with stuff, and start it off expanding at a particular rate, you’d fully expect it to have those two major physical phenomena — gravitational attraction and the initial expansion — fight against each other. Depending on how it all turned out, the Universe ought to follow one of three paths:

    The Universe expands fast enough that even with all the matter and energy in the Universe, it can slow the expansion down but never reverse it. In this case, the Universe expands forever.
    The Universe begins expanding quickly, but there’s too much matter and energy. The expansion slows, comes to a halt, reverses, and the Universe eventually recollapses.
    Or, perhaps, the Universe — like the third bowl of porridge in Goldilocks — is just right. Perhaps the expansion rate and the amount of stuff in the Universe are perfectly balanced, with the expansion rate asymptoting to zero.

    That last case can only occur if the energy density of the Universe equals some perfectly balanced value: the critical density.
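Where that balanced value comes from is standard textbook cosmology rather than anything specific to this article: set the spatial curvature to zero in the first Friedmann equation and solve for the density.

```latex
% First Friedmann equation for a spatially flat (k = 0) Universe,
% and the density that exactly balances a given expansion rate H_0:
\[
  H^2 = \frac{8\pi G}{3}\,\rho
  \quad\Longrightarrow\quad
  \rho_c = \frac{3H_0^2}{8\pi G}
  \approx 9\times10^{-27}\ \mathrm{kg\,m^{-3}}
  \quad\text{for } H_0 \approx 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}},
\]
% equivalent to only about five or six proton masses per cubic meter.
```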

The expected fates of the Universe (top three illustrations) all correspond to a Universe where matter and energy fight against the initial expansion rate. In our observed Universe, a cosmic acceleration is caused by some type of dark energy, which is hitherto unexplained. (E. Siegel / Beyond the Galaxy)

    This is actually a beautiful setup, because the equations you derive from General Relativity are completely deterministic here. Measure how the Universe is expanding today and how it was expanding in the past, and you know exactly what the Universe must be made out of. You can derive how old the Universe has to be, how much matter and radiation (and curvature, and any other stuff) has to be in it, and all sorts of other interesting information. If we could know those two numbers exactly, H_0 and q_0, we would immediately know both the Universe’s age and also what the Universe is made out of.
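To make “measure two numbers, learn the composition” concrete, here is the textbook relation (not spelled out in the article) between q_0 and the density fractions Ω_i of components with equations of state w_i:

```latex
% Deceleration parameter as a weighted sum over energy components;
% matter has w = 0, a cosmological constant has w = -1:
\[
  q_0 = \tfrac{1}{2}\sum_i \Omega_i\,(1 + 3w_i)
      = \tfrac{1}{2}\Omega_m - \Omega_\Lambda
      \approx \tfrac{1}{2}(0.3) - 0.7 = -0.55,
\]
% so the composition quoted later in this article implies a negative
% q_0: an accelerating, not decelerating, expansion.
```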

Three different types of measurements (distant stars and galaxies, the large-scale structure of the Universe, and the fluctuations in the CMB) tell us the expansion history of the Universe. (NASA/ESA Hubble, Sloan Digital Sky Survey, ESA and the Planck Collaboration [ESA/Planck pictured above])

    NASA/ESA Hubble Telescope


    SDSS Telescope at Apache Point Observatory, near Sunspot NM, USA, Altitude 2,788 meters (9,147 ft)

Now, we had some preconceptions when we started down this path. For reasons of aesthetic or mathematical prejudice, some people preferred the recollapsing Universe, while others preferred the critical Universe and still others preferred the open one. In reality, all you can do, if you want to understand the Universe, is examine it and ask it what it’s made of. Our laws of physics tell us what rules the Universe plays by; the rest is determined by measurement. For a long time, measurements of the Hubble constant were highly uncertain, but one thing became clear: if the Universe were made of 100% normal matter, it would have to be very young.

    Measuring back in time and distance (to the left of “today”) can inform how the Universe will evolve and accelerate/decelerate far into the future. We can learn that acceleration turned on about 7.8 billion years ago with the current data, but also learn that the models of the Universe without dark energy have either Hubble constants that are too low or ages that are too young to match with observations. (Saul Perlmutter, Nobel Laureate, of Berkeley)

If the expansion rate, H_0, was fast, like 100 km/s/Mpc, the Universe would only be 6.5 billion years old. Given that the ages of stars in globular clusters — admittedly, some of the oldest stars in the Universe — were at least 12 billion years (and many cited numbers closer to 14–16 billion), the Universe couldn’t be this young. While some measurements of H_0 were significantly lower, like 55 km/s/Mpc, that still gave a Universe that was 11-and-change billion years old: still younger than the stars we found within it. Moreover, as more and more measurements came in during the 1970s, 1980s and beyond, it became clear that an abnormally low Hubble constant in the 40s or 50s simply didn’t line up with the data.
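The ages quoted above follow from a one-line result: a flat, matter-only (Einstein-de Sitter) Universe is exactly two-thirds of a Hubble time old, t = (2/3)(1/H_0). A minimal sketch reproducing the article’s numbers:

```python
# Age of a flat, matter-dominated Universe: t = (2/3) / H0.
# One Hubble time, 1/H0, is ~977.8 / H0 gigayears when H0 is
# expressed in km/s/Mpc (from 1 Mpc = 3.086e19 km).

def matter_only_age_gyr(h0_km_s_mpc: float) -> float:
    hubble_time_gyr = 977.8 / h0_km_s_mpc
    return (2.0 / 3.0) * hubble_time_gyr

for h0 in (100, 55):
    print(f"H0 = {h0:3d} km/s/Mpc -> age = {matter_only_age_gyr(h0):.1f} Gyr")
# H0 = 100 -> 6.5 Gyr; H0 = 55 -> 11.9 Gyr. Both fall short of the
# 12+ billion-year ages measured for globular-cluster stars.
```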

    The globular cluster Messier 75, showing a huge central concentration, is over 13 billion years old. Many globular clusters have stellar populations that are in excess of 12 or even 13 billion years, a challenge for ‘matter-only’ models of the Universe. (HST / Fabian RRRR, with data from the Hubble Legacy Archive)

At the same time, we were beginning to measure to good precision how abundant the light elements in the Universe were. Big Bang Nucleosynthesis is the science of how much hydrogen, helium-4, helium-3, deuterium, and lithium-7 ought to be left over from the Big Bang, relative to one another. The only parameter in these calculations that isn’t derivable from physical constants is the baryon-to-photon ratio, which tells you the density of normal matter in the Universe. (This is relative to the number density of photons, but that is easily measurable from the Cosmic Microwave Background.) While there was some uncertainty at the time, it became clear very quickly that 100% of the matter couldn’t be “normal,” but only about 10% at most. There is no way the laws of physics could be correct and give you a Universe with 100% normal matter.
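That chain of reasoning, from a measured baryon-to-photon ratio to “normal matter is only a few percent of the critical density,” fits in a few lines. This is a standard back-of-the-envelope estimate, not a calculation from the article; the ratio η ≈ 6.1 × 10⁻¹⁰ is the modern best-fit value:

```python
import math

# Photon number density of the 2.725 K CMB:
#   n_gamma = (2 * zeta(3) / pi^2) * (k_B * T / (hbar * c))^3
K_B, HBAR, C = 1.381e-23, 1.055e-34, 2.998e8     # SI units
G, M_PROTON = 6.674e-11, 1.673e-27
ZETA3, T_CMB = 1.20206, 2.725
ETA = 6.1e-10          # baryon-to-photon ratio (modern best fit)

n_gamma = (2 * ZETA3 / math.pi**2) * (K_B * T_CMB / (HBAR * C))**3
n_baryon = ETA * n_gamma                 # baryons per cubic meter
rho_baryon = n_baryon * M_PROTON         # treat each baryon as a proton

H0 = 67.7 * 1000 / 3.086e22              # 67.7 km/s/Mpc in 1/s
rho_crit = 3 * H0**2 / (8 * math.pi * G)

print(f"n_gamma ~ {n_gamma / 1e6:.0f} photons/cm^3")    # ~411 /cm^3
print(f"Omega_baryon ~ {rho_baryon / rho_crit:.3f}")    # ~0.05
```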

The abundances of helium-4, deuterium, helium-3 and lithium-7 as predicted by Big Bang Nucleosynthesis, with observations shown in the red circles. This corresponds to a Universe where the baryon density (normal matter density) is only 5% of the critical value. (NASA / WMAP Science Team)

    By the early 1990s, this began to line up with a slew of observations that all pointed to pieces of this cosmic puzzle:

    The oldest stars had to be at least 13 billion years old,
    If the Universe were made of 100% matter, the value of H_0 could be no bigger than 50 km/s/Mpc to get a Universe that old,
    Galaxies and clusters of galaxies showed strong evidence that there was lots of dark matter,
    X-ray observations from clusters showed that only 10–20% of the matter could be normal matter,
The large-scale structure of the Universe (correlations between galaxies on scales of hundreds of millions of light years) showed you need more mass than normal matter could provide,
    but the deep source counts, which depend on the Universe’s volume and how that changes over time, showed that 100% matter was far too much,
    Gravitational lensing was starting to “weigh” these galaxy clusters, and found that only about 30% of the critical density was total matter,
    and Big Bang Nucleosynthesis really seemed to favor a Universe where just ~1/6th of the matter density was normal matter.

    So what was the solution?

The mass distribution of cluster Abell 370, reconstructed through gravitational lensing, shows two large, diffuse halos of mass, consistent with dark matter in two merging clusters. Around and through every galaxy, cluster, and massive collection of normal matter exists 5 times as much dark matter, overall. This still isn’t enough to reach the critical density, or anywhere close to it, on its own. (NASA, ESA, D. Harvey (École Polytechnique Fédérale de Lausanne, Switzerland), R. Massey (Durham University, UK), the Hubble SM4 ERO Team and ST-ECF)

    Gravitational Lensing NASA/ESA

Most astronomers had accepted dark matter by this time, but even a Universe made exclusively of dark and normal matter would still be problematic. It simply wasn’t old enough for the stars in it! Two pieces of evidence that came together in the late 1990s gave us the way forward. One was the cosmic microwave background, which showed us that the Universe was spatially flat, and therefore the total amount of stuff in it added up to 100% of the critical density. Yet it couldn’t all be matter, even a mix of normal and dark matter! The other piece of evidence was supernova data, which showed that there was a component in the Universe causing its expansion to accelerate: this must be dark energy. Looking at the multiple lines of evidence even today, they all point to that exact picture.


    Constraints on dark energy from three independent sources: supernovae, the CMB, and BAO (which are a feature in the Universe’s large-scale structure). Note that even without supernovae, we’d need dark energy, and that only 1/6th of the matter found can be normal matter; the rest must be dark matter. (Supernova Cosmology Project, Amanullah, et al., Ap.J. (2010))

So you have all these independent lines of evidence, all pointing towards the same picture: General Relativity is our theory of gravity, and our Universe is 13.8 billion years old, with ~70% dark energy and ~30% total matter, where about 5% is normal matter and 25% is dark matter. There are photons and neutrinos, which were important in the past, but they’re just a small fraction of a percent today. As even greater evidence has come in — small-scale fluctuations in the cosmic microwave background, the baryon oscillations in the large-scale structure of the Universe, high-redshift quasars and gamma-ray bursts — this picture remains unchanged. Everything we observe on all scales points to it.
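The 13.8-billion-year figure is not an extra assumption; it follows from integrating the Friedmann equation with exactly that inventory. A minimal numerical sketch, assuming round Planck-like parameters (H_0 = 67.7 km/s/Mpc, Ω_m = 0.31, Ω_Λ = 0.69):

```python
import math
from scipy.integrate import quad

H0 = 67.7 * 1000 / 3.086e22      # Hubble constant in 1/s
OMEGA_M, OMEGA_L = 0.31, 0.69    # flat Universe: fractions sum to 1

def H(a: float) -> float:
    """Expansion rate at scale factor a in a flat LCDM Universe."""
    return H0 * math.sqrt(OMEGA_M / a**3 + OMEGA_L)

# Age of the Universe: t0 = integral from a=0 to a=1 of da / (a * H(a)).
age_s, _ = quad(lambda a: 1.0 / (a * H(a)), 0.0, 1.0)
print(f"LCDM age ~ {age_s / 3.156e16:.1f} Gyr")          # ~13.8 Gyr

# Remove dark energy (matter-only, same H0): integrand becomes sqrt(a)/H0.
age_m, _ = quad(lambda a: math.sqrt(a) / H0, 0.0, 1.0)
print(f"matter-only age ~ {age_m / 3.156e16:.1f} Gyr")   # ~9.6 Gyr
```

A matter-only Universe with the same expansion rate today would be younger than its oldest stars; that is the age problem the dark-energy component resolves.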

    The farther away we look, the closer in time we’re seeing towards the Big Bang. The newest record-holder for quasars comes from a time when the Universe was just 690 million years old. These ultra-distant cosmological probes also show us a Universe that contains dark matter and dark energy. (Jinyi Yang, University of Arizona; Reidar Hahn, Fermilab; M. Newhouse NOAO/AURA/NSF)

It wasn’t always apparent that this would be the solution, but this one solution works for literally all the observations. When someone puts forth the hypothesis that “dark matter and/or dark energy doesn’t exist,” the onus is on them to answer the implicit question, “okay, then what replaces General Relativity as your theory of gravity to explain the entire Universe?” As gravitational wave astronomy has confirmed Einstein’s greatest theory ever more spectacularly, even many of the fringe alternatives to General Relativity have fallen away. As it stands now, no theories exist that successfully do away with dark matter and dark energy and still explain everything that we see. Until there are, there are no real alternatives to the modern picture that deserve to be taken seriously.

A detailed look at the Universe reveals that it’s made of matter and not antimatter, that dark matter and dark energy are required, and that we don’t know the origin of any of these mysteries. However, the fluctuations in the CMB, the formation of and correlations between large-scale structure, and modern observations of gravitational lensing, among many others, all point towards the same picture. (Chris Blake and Sam Moorfield)

    It might not feel right to you, in your gut, that 95% of the Universe would be dark. It might not seem like it’s a reasonable possibility when all you’d need to do, in principle, is to replace your underlying laws with new ones. But until those laws are found, and it hasn’t even been shown that they could mathematically exist, you absolutely have to go with the description of the Universe that all the evidence points to. Anything else is simply an unscientific conclusion.

    And, here we are:

Universe map: Sloan Digital Sky Survey (SDSS) / 2dF Galaxy Redshift Survey

See the full article here.

    Please help promote STEM in your local schools.


STEM Education Coalition

“Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan.

     