Tagged: Dark Energy

  • richardmitnick 9:41 pm on May 18, 2017 Permalink | Reply
    Tags: Dark Energy

    From Nautilus: “The Physicist Who Denies Dark Matter” Revised and Improved from post of 2017/03/01 

    Nautilus

    May 18, 2017
    Oded Carmeli

    Mordehai Milgrom. Cosmos on Nautilus

    Maybe Newtonian physics doesn’t need dark matter to work.

    “He is one of those dark matter people,” Mordehai Milgrom said about a colleague stopping by his office at the Weizmann Institute of Science. Milgrom introduced us, telling me that his friend is searching for evidence of dark matter in a project taking place just down the hall.

    “There are no ‘dark matter people’ and ‘MOND people,’ ” his colleague retorted.

    http://www.astro.umd.edu/~ssm/mond/

    “I am ‘MOND people,’” Milgrom proudly proclaimed, referring to Modified Newtonian Dynamics, his theory that fixes Newtonian physics instead of postulating the existence of dark matter and dark energy—two things that, according to the standard model of cosmology, constitute 95.1 percent of the total mass-energy content of the universe.

    This friendly incident is indicative of Mordehai (“Moti”) Milgrom’s calmly quixotic character. There is something almost misleading about the 70-year-old physicist wearing shorts in the hot Israeli summer, whose soft voice breaks whenever he gets excited. Nothing about his pleasant demeanor reveals that this man claims to be the third person to correct Newtonian physics: First Max Planck (with quantum theory), then Einstein (with relativity), now Milgrom.

    This year marks Milgrom’s 50th year at the Weizmann.


    Weizmann Institute Campus

    I visited him there to learn more about how it feels to be a science maverick, what he appreciates about Thomas Kuhn’s The Structure of Scientific Revolutions, and why he thinks dark matter and dark energy don’t exist.

    NASA

    What inspired you to dedicate your life to the motion of stars?

    I remember very vividly the way physics struck me. I was 16 and I thought: Here is a way to understand how things work, far beyond the understanding of my peers. It wasn’t a long-term plan. It was a daily attraction. I simply loved physics, the same way other people love art or sports. I never dreamed of one day making a major discovery, like correcting Newton.

    I had a terrific physics teacher at school, but when you study textbook material, you’re studying done deals. You still don’t see the effort that goes into making breakthrough science, when things are unclear and advances are made intuitively and often go wrong. They don’t teach you that at school. They teach you that science always goes forward: You have a body of knowledge, and then someone discovers something and expands that body of knowledge. But it doesn’t really work that way. The progress of science is never linear.

    How did you get involved with the problem of dark matter?

    Toward the end of my Ph.D., the physics department here wanted to expand. So they asked three top Ph.D. students working on particle physics to choose a new field. We chose astrophysics, and the Weizmann Institute pulled some strings with institutions abroad so they would accept us as postdocs. And so I went to Cornell to fill my gaps in astrophysics.

    After a few years in high energy astrophysics, working on the physics of X-ray radiation in space, I decided to move to yet another field: The dynamics of galaxies. It was a few years after the first detailed measurements of the speed of stars orbiting spiral galaxies came in. And, well, there was a problem with the measurements.

    To understand this problem, one needs to wrap one’s head around some celestial rotations. Our planet orbits the sun, which, in turn, orbits the center of the Milky Way galaxy. Inside solar systems, the gravitational pull from the mass of the sun and the speed of the planets are in balance. By Newton’s laws, this is why Mercury, the innermost planet in our solar system, orbits the sun at over 100,000 miles per hour, while the outermost planet, Neptune, is crawling at just over 10,000 miles per hour.
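    As a rough check on those figures, here is a minimal Python sketch (mine, not from the article) that applies Newton’s circular-orbit formula, v = √(GM/r), to Mercury and Neptune; the orbital radii and solar mass below are approximate textbook values.

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg (approximate)

# Approximate mean orbital radii in metres
orbits = {
    "Mercury": 5.79e10,
    "Neptune": 4.50e12,
}

for planet, r in orbits.items():
    v = math.sqrt(G * M_SUN / r)   # circular orbital speed, m/s
    mph = v * 2.23694              # convert m/s to miles per hour
    print(f"{planet}: {v/1000:.1f} km/s (~{mph:,.0f} mph)")

# Mercury comes out near 48 km/s (~107,000 mph) and Neptune near 5.4 km/s
# (~12,000 mph), matching the "over 100,000" and "just over 10,000" figures above.
```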

    Milky Way NASA/JPL-Caltech /ESO R. Hurt

    Now, you might assume that the same logic would apply to galaxies: The farther away the star is from the galaxy’s center, the slower it revolves around it; however, while at smaller radii the measurements were as predicted by Newtonian physics, farther stars proved to move much faster than predicted from the gravitational pull of the mass we see in these galaxies. The observed gap got a lot wider when, in the late 1970s, radio telescopes were able to detect and measure the cold gas clouds at the outskirts of galaxies. These clouds orbit the galactic center five times farther than the stars, and thus the anomaly grew to become a major scientific puzzle.

    One way to solve this puzzle is to simply add more matter. If there is too little visible mass at the center of galaxies to account for the speed of stars and gas, perhaps there is more matter than meets the eye, matter that we cannot see, dark matter.

    What made you first question the very existence of dark matter?

    What struck me was some regularity in the anomaly. The rotational velocities were not just larger than expected, they became constant with radius. Why? Sure, if there was dark matter, the speed of stars would be greater, but the rotation curves, meaning the rotational speed drawn as a function of the radius, could still go up and down depending on its distribution. But they didn’t. That really struck me as odd. So, in 1980, I went on sabbatical at the Institute for Advanced Study in Princeton with the following hunch: If the rotational speeds are constant, then perhaps we’re looking at a new law of nature. If Newtonian physics can’t predict the fixed curves, perhaps we should fix Newton, instead of making up a whole new class of matter just to fit our measurements.

    If you’re going to change the laws of nature that work so well in our own solar system, you need to find a property that differentiates solar systems from galaxies. So I made up a chart of different properties, such as size, mass, speed of rotation, etc. For each parameter, I put in the Earth, the solar system and some galaxies. For example, galaxies are bigger than solar systems, so perhaps Newton’s laws don’t work over large distances? But if this was the case, you would expect the rotation anomaly to grow bigger in bigger galaxies, while, in fact, it does not. So I crossed that one out and moved on to the next properties.

    I finally struck gold with acceleration: The pace at which the velocity of objects changes.

    NASA

    We usually think of earthbound cars that accelerate in the same direction, but imagine a merry-go-round. You could be going in circles and still accelerate. Otherwise, you would simply fall off. The same goes for celestial merry-go-rounds. And it’s in acceleration that we find a big difference in scales, one that justifies modifying Newton: The normal acceleration for a star orbiting the center of a galaxy is about a hundred million times smaller than that of the Earth orbiting the sun.

    For those small accelerations, MOND introduces a new constant of nature, called a0. If you studied physics in high school, you probably remember Newton’s second law: force equals mass times acceleration, or F = ma. While this is a perfectly good tool when dealing with accelerations much greater than a0, such as those of the planets around our sun, I suggested that at significantly lower accelerations, lower even than that of our sun around the galactic center, force becomes proportional to the square of the acceleration, or F = ma²/a0.

    To put it in other words: According to Newton’s laws, the rotation speed of stars around galactic centers should decrease the farther the star is from the center of mass. If MOND is correct, it should reach a constant value, thus eliminating the need for dark matter.
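    To see how that flattening falls out of the formula, here is a small illustrative Python sketch (my own, with an assumed galaxy mass and the commonly quoted value of a0): in the deep-MOND regime, setting GM/r² = a²/a0 with a = v²/r gives v⁴ = GMa0, a speed that does not depend on radius.

```python
import math

G = 6.674e-11               # gravitational constant, m^3 kg^-1 s^-2
A0 = 1.2e-10                # MOND acceleration constant, m/s^2 (commonly quoted value)
M_GAL = 1.0e11 * 1.989e30   # visible mass of an illustrative spiral galaxy, kg
KPC = 3.086e19              # one kiloparsec in metres

# Deep-MOND limit: G*M/r^2 = a^2/a0 with a = v^2/r  =>  v^4 = G*M*a0,
# so the predicted speed is the same at every radius (a flat rotation curve).
v_flat = (G * M_GAL * A0) ** 0.25

for r_kpc in (2, 5, 10, 20, 40, 80):
    r = r_kpc * KPC
    v_newton = math.sqrt(G * M_GAL / r)   # Newtonian prediction keeps falling with r
    print(f"r = {r_kpc:3d} kpc: Newton {v_newton/1e3:6.1f} km/s, "
          f"deep MOND {v_flat/1e3:6.1f} km/s")

# For this mass the deep-MOND speed settles near 200 km/s at every radius,
# roughly the flat rotation speeds measured in real spiral galaxies.
```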

    What did your colleagues at Princeton think about all this?

    I didn’t share these thoughts with my colleagues at Princeton. I was afraid to come across as, well, crazy. And then, in 1981, when I already had a clear idea of MOND, I didn’t want anyone to jump on my wagon, so to speak, which is even crazier when you think about it. Needless to say [laughs] no one jumped on my wagon, even when I desperately wanted them to.

    Well, you were 35 and you proposed to fix Newton.

    Why not? What’s the big deal? If something doesn’t work, fix it. I wasn’t trying to be bold. I was very naïve at the time. I didn’t understand that scientists are just as swayed as other people by conventions and interests.

    Like Thomas Kuhn’s The Structure of Scientific Revolutions.


    I love that book. I read it several times. It showed me how my life’s story has happened to so many other scientists throughout history. Sure, it’s easy to make fun of people who once objected to what we now know is good science, but are we any different? Kuhn stresses that these objectors are usually good scientists with good reasons to object. It is just that the dissenters usually have a unique point of view of things that is not shared by most others. I laugh about it now, because MOND has made such progress, but there were times when I felt depressed and isolated.

    What’s it like being a science maverick?

    By and large, the last 35 years have been exciting and rewarding exactly because I have been advocating a maverick paradigm. I am a loner by nature, and despite the daunting and doubting times, I much prefer this to being carried with the general flow. I was quite confident in the basic validity of MOND from the very start, which helped me a lot in taking all this in stride, but there are two great advantages to the lingering opposition to MOND: Firstly, it gave me time to make more contributions to MOND than I would had the community jumped on the MOND wagon early on. Secondly, once MOND is accepted, the long and wide resistance to it will only have proven how nontrivial an idea it is.

    By the end of my sabbatical in Princeton, I had secretly written three papers introducing MOND to the world. Publishing them, however, was a whole different story. At first I sent my kernel paper to journals such as Nature and Astrophysical Journal Letters, and it got rejected almost off-hand. It took a long time until all three papers were published, side by side, in Astrophysical Journal.

    The first person to hear about MOND was my wife Yvonne. Frankly, tears come to my eyes when I say this. Yvonne is not a scientist, but she has been my greatest supporter.

    The first scientist to back MOND was another physics maverick: The late Professor Jacob Bekenstein, who was the first to suggest that black holes should have a well-defined entropy, later dubbed the Bekenstein-Hawking entropy. After I submitted the initial MOND trilogy, I sent the preprints to several astrophysicists, but Jacob was the first scientist I discussed MOND with. He was enthusiastic and encouraging from the very start.

    Slowly but surely, this tiny opposition to dark matter grew from just two physicists to several hundred proponents, or at least scientists who take MOND seriously. Dark matter is still the scientific consensus, but MOND is now a formidable opponent that proclaims the emperor has no clothes, that dark matter is our generation’s ether.

    So what happened? As far as dark matter is concerned, nothing really. A host of experiments searching for dark matter, including the Large Hadron Collider, many underground experiments and several space missions, have failed to directly observe its very existence. Meanwhile, MOND was able to accurately predict the rotation of more and more spiral galaxies—over 150 galaxies to date, to be precise.

    All of them? Some papers claim that MOND wasn’t able to predict the dynamics of certain galaxies.

    That’s true and it’s perfectly fine, because MOND’s predictions are based on measurements. Given the distribution of regular, visible matter alone, MOND can predict the dynamics of galaxies. But that prediction is based on our initial measurements. We measure the light coming in from a galaxy to calculate its mass, but we often don’t know the distance to that galaxy for sure, so we don’t know for certain just how massive that galaxy really is. And there are other variables, such as molecular gas, that we can’t observe at all. So yes, some galaxies don’t perfectly match MOND’s predictions, but all in all, it’s almost a miracle that we have enough data on galaxies to prove MOND right, over and over again.

    Your opponents say MOND’s greatest flaw is its incompatibility with relativistic physics.

    In 2004, Bekenstein proposed his TeVeS, or Relativistic Gravitational Theory for MOND.

    http://astroweb.case.edu/ssm/mond/

    Since then, several different relativistic MOND formulations have been put forth, including one by me, called Bimetric MOND, or BIMOND.

    So, no, incorporating MOND into Einsteinian physics is no longer a challenge. I hear this statement still made, but only from people who parrot others, who themselves are not abreast of the developments of the last 10 years. There are several relativistic versions of MOND. What remains a challenge is demonstrating that MOND can account for the mass anomalies in cosmology.

    Another argument that cosmologists often make is that dark matter is needed not just for motion within galaxies, but on even larger scales. What does MOND have to say about that?

    According to the Big Bang theory, the universe began as a uniform singularity 13.8 billion years ago. And, just as in galaxies, observations made of the cosmic background radiation from the early universe suggest that the gravity of all the matter in the universe is simply not enough to form the different patterns we currently see, like galaxies and stars, in just 13.8 billion years. Once again, dark matter was called to the rescue: It does not emit radiation, but it does engage visible material with gravitation. And so, starting from the 1980s, the new cosmological dogma was that dark matter constituted a staggering 95 percent of all matter in the universe. That lasted, well, right until the bomb hit us in 1998.

    It turned out that the expansion of the universe is accelerating, not decelerating like all of us originally thought.

    Timeline of the universe, assuming a cosmological constant. Coldcreation/wikimedia, CC BY-SA

    Any form of genuine matter, dark or not, should have slowed the expansion down. And so a whole new type of entity was invented: Dark energy. Now the accepted cosmology is that the universe is made up of 70 percent dark energy, 25 percent dark matter, and 5 percent regular matter.

    Dark energy depiction. Image: Volker Springel/Max Planck Institute for Astrophysics/SP

    But dark energy is just a quick fix, the same as dark matter is. And just as in galaxies, you can either invent a whole new type of energy and then spend years trying to understand its properties, or you can try fixing your theory.

    Among other things, MOND points to a very deep connection between structure and dynamics in galaxies and cosmology. This is not expected in accepted physics. Galaxies are tiny structures within the grand scale of the universe, and those structures can behave differently without contradicting the current cosmological consensus. However, MOND creates this connection, binding the two.

    This connection is surprising: For whatever reason, the MOND constant a0 is close to the acceleration that characterizes the universe itself. In fact, MOND’s constant equals the speed of light squared, divided by the radius of the universe.
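    A quick back-of-the-envelope check of that coincidence (my own sketch, taking the “radius of the universe” to be the Hubble radius c/H0 and assuming a Hubble constant of about 70 km/s/Mpc):

```python
C = 2.998e8                 # speed of light, m/s
H0 = 70 * 1000 / 3.086e22   # Hubble constant ~70 km/s/Mpc, converted to 1/s
A0 = 1.2e-10                # MOND acceleration constant, m/s^2

hubble_radius = C / H0               # ~1.3e26 m
c2_over_R = C**2 / hubble_radius     # equals c * H0, ~7e-10 m/s^2

print(f"c^2 / R_Hubble = {c2_over_R:.2e} m/s^2")
print(f"a0             = {A0:.2e} m/s^2")
print(f"ratio          = {c2_over_R / A0:.1f}")

# The two accelerations agree to within a factor of order unity (roughly 2*pi),
# which is the kind of closeness Milgrom is pointing to here.
```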

    So, indeed, to your question, the conundrum pointed to is valid at present. MOND doesn’t have a sufficient cosmology yet, but we’re working on it. And once we fully understand MOND, I believe we’ll also fully understand the expansion of the universe, and vice versa: A new cosmological theory would explain MOND. Wouldn’t that be amazing?

    What do you think about the proposed unified theories of physics, which merge MOND with quantum mechanics?

    These all hark back to my 1999 paper on MOND as a vacuum effect, where it was pointed out that the quantum vacuum in a universe such as ours may produce MOND behavior within galaxies, with the cosmological constant appearing in the guise of the MOND acceleration constant, a0. But I am greatly gratified to see these propositions put forth, especially because they are made by people outside the traditional MOND community. It is very important that researchers from other backgrounds become interested in MOND and bring new ideas to further our understanding of its origin.

    And what if you had a unified theory of physics that explains everything? What then?

    You know, I’m not a religious person, but I often think about our tiny blue dot, and the painstaking work we physicists do here. Who knows? Perhaps somewhere out there, in one of those galaxies I spent my life researching, there already is a known unified theory of physics, with a variation of MOND built into it. But then I think: So what? We still had fun doing the math. We still had the thrill of trying to wrap our heads around the universe, even if the universe never noticed it at all.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 8:19 am on May 17, 2017 Permalink | Reply
    Tags: Dark Energy, New Explanation for Dark Energy? Tiny Fluctuations of Time and Space

    From Universe Today: “New Explanation for Dark Energy? Tiny Fluctuations of Time and Space” 

    Universe Today

    May 16, 2017
    Matt Williams

    A new study by researchers from the University of British Columbia offers a new explanation of Dark Energy. Credit: NASA

    Since the late 1920s, astronomers have been aware of the fact that the Universe is in a state of expansion. Initially predicted by Einstein’s Theory of General Relativity, this realization has gone on to inform the most widely-accepted cosmological model – the Big Bang Theory. However, things became somewhat confusing during the 1990s, when improved observations showed that the Universe’s rate of expansion has been accelerating for billions of years.

    This led to the theory of Dark Energy, a mysterious invisible force that is driving the expansion of the cosmos. Much like Dark Matter, which explained the “missing mass”, it then became necessary to find this elusive energy, or at least provide a coherent theoretical framework for it. A new study from the University of British Columbia (UBC) seeks to do just that by postulating that the Universe is expanding due to fluctuations in space and time.

    The study – which was recently published in the journal Physical Review D – was led by Qingdi Wang, a PhD student with the Department of Physics and Astronomy at UBC. Under the supervision of UBC Professor William Unruh (the man who proposed the Unruh Effect) and with assistance from Zhen Zhu (another PhD student at UBC), they provide a new take on Dark Energy.

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe) Credit: Alex Mittelmann

    Inflationary Universe. NASA/WMAP

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 12:44 pm on May 9, 2017 Permalink | Reply
    Tags: Dark Energy, Detecting infrared light, ESA/Euclid

    From JPL-Caltech: “NASA Delivers Detectors for ESA’s Euclid Spacecraft” 

    NASA JPL Banner

    JPL-Caltech

    May 9, 2017
    Elizabeth Landau
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-354-6425
    elizabeth.landau@jpl.nasa.gov

    Giuseppe Racca
    Euclid Project Manager
    Directorate of Science
    European Space Agency
    giuseppe.racca@esa.int

    René Laureijs
    Euclid Project Scientist
    Directorate of Science
    European Space Agency
    Rene.Laureijs@esa.int

    ESA/Euclid spacecraft

    Three detector systems for the Euclid mission, led by ESA (European Space Agency), have been delivered to Europe for the spacecraft’s near-infrared instrument. The detector systems are key components of NASA’s contribution to this upcoming mission to study some of the biggest questions about the universe, including those related to the properties and effects of dark matter and dark energy — two critical, but invisible phenomena that scientists think make up the vast majority of our universe.

    “The delivery of these detector systems is a milestone for what we hope will be an extremely exciting mission, the first space mission dedicated to going after the mysterious dark energy,” said Michael Seiffert, the NASA Euclid project scientist based at NASA’s Jet Propulsion Laboratory, Pasadena, California, which manages the development and implementation of the detector systems.

    Euclid will carry two instruments: a visible-light imager (VIS) and a near-infrared spectrometer and photometer (NISP). A special light-splitting plate on the Euclid telescope enables incoming light to be shared by both instruments, so they can carry out observations simultaneously.

    The spacecraft, scheduled for launch in 2020, will observe billions of faint galaxies and investigate why the universe is expanding at an accelerating pace. Astrophysicists think dark energy is responsible for this effect, and Euclid will explore this hypothesis and help constrain dark energy models. This census of distant galaxies will also reveal how galaxies are distributed in our universe, which will help astrophysicists understand how the delicate interplay of the gravity of dark matter, luminous matter and dark energy forms large-scale structures in the universe.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Additionally, the location of galaxies in relation to each other tells scientists how they are clustered. Dark matter, an invisible substance accounting for over 80 percent of matter in our universe, can cause subtle distortions in the apparent shapes of galaxies. That is because its gravity bends light that travels from a distant galaxy toward an observer, which changes the appearance of the galaxy when it is viewed from a telescope.

    Gravitational Lensing NASA/ESA

    Euclid’s combination of visible and infrared instruments will examine this distortion effect and allow astronomers to probe dark matter and the effects of dark energy.

    Detecting infrared light, which is invisible to the human eye, is especially important for studying the universe’s distant galaxies. Much like the Doppler effect for sound, where a siren’s pitch seems higher as it approaches and lower as it moves away, the frequency of light from an astronomical object gets shifted with motion. Light from objects that are traveling away from us appears redder, and light from those approaching us appears bluer. Because the universe is expanding, distant galaxies are moving away from us, so their light gets stretched out to longer wavelengths. Between 6 and 10 billion light-years away, galaxies are brightest in infrared light.
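    A minimal sketch of that wavelength stretching (illustrative numbers, not from the article): cosmological redshift scales every wavelength by a factor of (1 + z), so visible-light emission from galaxies at roughly z = 0.5 to 2 lands in Euclid’s near-infrared band.

```python
# Observed wavelength = (1 + z) * emitted wavelength
def observed_wavelength_nm(rest_nm, z):
    return (1.0 + z) * rest_nm

H_ALPHA_NM = 656.3   # H-alpha emission line, rest wavelength in nm (visible red)

for z in (0.5, 1.0, 1.5, 2.0):
    obs = observed_wavelength_nm(H_ALPHA_NM, z)
    print(f"z = {z}: H-alpha observed at {obs:.0f} nm ({obs/1000:.2f} micrometres)")

# By z ~ 1 the line has moved past ~1.3 micrometres, well into the near-infrared
# range that detectors like NISP's are built to see.
```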

    JPL procured the NISP detector systems, which were manufactured by Teledyne Imaging Sensors of Camarillo, California. They were tested at JPL and at NASA’s Goddard Space Flight Center, Greenbelt, Maryland, before being shipped to France and the NISP team.

    Each detector system consists of a detector, a cable and a “readout electronics chip” that converts infrared light to data signals read by an onboard computer and transmitted to Earth for analysis. Sixteen detectors will fly on Euclid, each composed of 2040 by 2040 pixels. They will cover a field of view slightly larger than twice the area covered by a full moon. The detectors are made of a mercury-cadmium-telluride mixture and are designed to operate at extremely cold temperatures.
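    For a sense of scale, the quoted detector format works out as follows (simple arithmetic on the numbers in the paragraph above):

```python
detectors = 16
pixels_per_side = 2040

total_pixels = detectors * pixels_per_side ** 2
print(f"{total_pixels:,} pixels (~{total_pixels / 1e6:.1f} megapixels)")
# 66,585,600 pixels, i.e. roughly a 67-megapixel near-infrared focal plane.
```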

    “The U.S. Euclid team has overcome many technical hurdles along the way, and we are delivering superb detectors that will enable the collection of unprecedented data during the mission,” said Ulf Israelsson, the NASA Euclid project manager, based at JPL.

    Delivery to ESA of the next set of detectors for NISP is planned in early June. The Centre de Physique des Particules de Marseille, France, will provide further characterization of the detector systems. The final detector focal plane will then be assembled at the Laboratoire d’Astrophysique de Marseille, and integrated with the rest of NISP for instrument tests.

    For more information about Euclid, visit:

    http://sci.esa.int/Euclid

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo

    NASA image

     
  • richardmitnick 12:58 pm on April 18, 2017 Permalink | Reply
    Tags: Dark Energy

    From EarthSky: “Who needs dark energy?” 

    EarthSky

    April 17, 2017
    Brian Koberlein

    Dark energy is thought to be the driver for the expansion of the universe. But do we need dark energy to account for an expanding universe?

    Image via Brian Koberlein/ One Universe at a Time.

    Our universe is expanding. We’ve known this for nearly a century, and modern observations continue to support this. Not only is our universe expanding, it is doing so at an ever-increasing rate. But the question remains as to what drives this cosmic expansion. The most popular answer is what we call dark energy. But do we need dark energy to account for an expanding universe? Perhaps not.

    The idea of dark energy comes from a property of general relativity known as the cosmological constant. The basic idea of general relativity is that the presence of matter curves space and time (see https://briankoberlein.com/2013/09/09/the-attraction-of-curves/). As a result, light and matter are deflected from simple straight paths in a way that resembles a gravitational force. The simplest mathematical model in relativity just describes this connection between matter and curvature, but it turns out that the equations also allow for an extra parameter, the cosmological constant, that can give space an overall rate of expansion. The cosmological constant perfectly describes the observed properties of dark energy, and it arises naturally in general relativity, so it’s a reasonable model to adopt.

    In classical relativity, the presence of a cosmological constant simply means that cosmic expansion is just a property of spacetime. But our universe is also governed by the quantum theory, and the quantum world doesn’t play well with the cosmological constant. One solution to this issue is that quantum vacuum energy might be driving cosmic expansion, but in quantum theory vacuum fluctuations would probably make the cosmological constant far larger than what we observe, so it isn’t a very satisfactory answer.

    Despite the unexplainable weirdness of dark energy, it matches observations so well that it has become part of the concordance model for cosmology, also known as the Lambda-CDM model. Here the Greek letter Lambda is the symbol for dark energy, and CDM stands for Cold Dark Matter.

    In this model there is a simple way to describe the overall shape of the cosmos, known as the Friedmann–Lemaître–Robertson–Walker (FLRW) metric. The only catch is that this assumes matter is distributed evenly throughout the universe. In the real universe matter is clumped together into clusters of galaxies, so the FLRW metric is only an approximation to the real shape of the universe. Since dark energy makes up about 70% of the mass/energy of the universe, the FLRW metric is generally thought to be a good approximation. But what if it isn’t?

    A new paper argues just that. Since matter clumps together, space would be more highly curved in those regions. In the large voids between the clusters of galaxies, there would be less space curvature. Relative to the clustered regions, the voids would appear to be expanding in a way that mimics the effect of dark energy. Using this idea, the team ran computer simulations of a universe with this cluster effect rather than dark energy. They found that the overall structure evolved similarly to dark energy models.

    That would seem to support the idea that dark energy might be an effect of clustered galaxies.

    It’s an interesting idea, but there are reasons to be skeptical. While such clustering can have some effect on cosmic expansion, it wouldn’t be nearly as strong as we observe. While this particular model seems to explain the scale at which the clustering of galaxies occurs, it doesn’t explain other effects, such as observations of distant supernovae which strongly support dark energy. Personally, I don’t find this new model very convincing, but I think ideas like this are certainly worth exploring. If the model can be further refined, it could be worth another look.

    Paper: Gábor Rácz et al., “Concordance cosmology without dark energy,” Monthly Notices of the Royal Astronomical Society: Letters (2017). DOI: 10.1093/mnrasl/slx026


    Dark Energy Camera [DECam], built at FNAL

    NOAO/CTIO Victor M Blanco 4m Telescope at Cerro Tololo, Chile, which houses DECam

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:10 am on March 30, 2017 Permalink | Reply
    Tags: Dark Energy

    From RAS: “Explaining the accelerating expansion of the universe without dark energy” 

    Royal Astronomical Society

    30 March 2017

    Enigmatic ‘dark energy’, thought to make up 68% of the universe, may not exist at all, according to a Hungarian-American team. The researchers believe that standard models of the universe fail to take account of its changing structure, but that once this is done the need for dark energy disappears. The team publish their results in a paper in Monthly Notices of the Royal Astronomical Society.

    A still from an animation that shows the expansion of the universe in the standard ‘Lambda Cold Dark Matter’ cosmology, which includes dark energy (top left panel, red), the new Avera model, that considers the structure of the universe and eliminates the need for dark energy (top middle panel, blue), and the Einstein-de Sitter cosmology, the original model without dark energy (top right panel, green). The panel at the bottom shows the increase of the ‘scale factor’ (an indication of the size) as a function of time, where 1Gya is 1 billion years. The growth of structure can also be seen in the top panels. One dot roughly represents an entire galaxy cluster. Units of scale are in Megaparsecs (Mpc), where 1 Mpc is around 3 million million million km. Credit: István Csabai et al.

    Our universe was formed in the Big Bang, 13.8 billion years ago, and has been expanding ever since. The key piece of evidence for this expansion is Hubble’s law, based on observations of galaxies, which states that on average, the speed with which a galaxy moves away from us is proportional to its distance.

    Astronomers measure this velocity of recession by looking at lines in the spectrum of a galaxy, which shift more towards red the faster the galaxy is moving away. From the 1920s, mapping the velocities of galaxies led scientists to conclude that the whole universe is expanding, and that it began life as a vanishingly small point.
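    As a concrete illustration of Hubble’s law (my own numbers, assuming a present-day Hubble constant of about 70 km/s per megaparsec, and using the low-redshift approximation z ≈ v/c):

```python
H0 = 70.0        # Hubble constant, km/s per megaparsec (assumed value)
C = 299_792.458  # speed of light, km/s

for distance_mpc in (10, 100, 1000):
    velocity = H0 * distance_mpc     # recession velocity, km/s
    z_approx = velocity / C          # low-redshift approximation z ~ v/c
    print(f"{distance_mpc:5d} Mpc -> {velocity:8.0f} km/s  (z ~ {z_approx:.3f})")

# Double the distance and the recession speed doubles: the linear relation
# Hubble found, and the basic observational signature of an expanding universe.
```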

    In the second half of the twentieth century, astronomers found evidence for unseen ‘dark matter’ by observing that something extra was needed to explain the motion of stars within galaxies. Dark matter is now thought to make up 27% of the content of the universe (in contrast, ‘ordinary’ matter amounts to only 5%).

    Observations of the explosions of white dwarf stars in binary systems, so-called Type Ia supernovae, in the 1990s then led scientists to the conclusion that a third component, dark energy, made up 68% of the cosmos, and is responsible for driving an acceleration in the expansion of the universe.

    In the new work, the researchers, led by PhD student Gábor Rácz of Eötvös Loránd University in Hungary, question the existence of dark energy and suggest an alternative explanation. They argue that conventional models of cosmology (the study of the origin and evolution of the universe) rely on approximations that ignore its structure and assume that matter has a uniform density.

    “Einstein’s equations of general relativity that describe the expansion of the universe are so complex mathematically, that for a hundred years no solutions accounting for the effect of cosmic structures have been found. We know from very precise supernova observations that the universe is accelerating, but at the same time we rely on coarse approximations to Einstein’s equations which may introduce serious side-effects, such as the need for dark energy, in the models designed to fit the observational data.” explains Dr László Dobos, co-author of the paper, also at Eötvös Loránd University.

    In practice, normal and dark matter appear to fill the universe with a foam-like structure, where galaxies are located on the thin walls between bubbles, and are grouped into superclusters. The insides of the bubbles are in contrast almost empty of both kinds of matter.

    Using a computer simulation to model the effect of gravity on the distribution of millions of particles of dark matter, the scientists reconstructed the evolution of the universe, including the early clumping of matter, and the formation of large scale structure.

    Unlike conventional simulations with a smoothly expanding universe, taking the structure into account led to a model where different regions of the cosmos expand at different rates. The average expansion rate, though, is consistent with present observations, which suggest an overall acceleration.

    Dr Dobos adds: “The theory of general relativity is fundamental in understanding the way the universe evolves. We do not question its validity; we question the validity of the approximate solutions. Our findings rely on a mathematical conjecture which permits the differential expansion of space, consistent with general relativity, and they show how the formation of complex structures of matter affects the expansion. These issues were previously swept under the rug but taking them into account can explain the acceleration without the need for dark energy.”

    If this finding is upheld, it could have a significant impact on models of the universe and the direction of research in physics. For the past 20 years, astronomers and theoretical physicists have speculated on the nature of dark energy, but it remains an unsolved mystery. With the new model, Csabai and his collaborators expect at the very least to start a lively debate.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Royal Astronomical Society (RAS), founded in 1820, encourages and promotes the study of astronomy, solar-system science, geophysics and closely related branches of science.

     
  • richardmitnick 2:39 pm on March 24, 2017 Permalink | Reply
    Tags: Dark Energy

    From WIRED: “Astronomers Don’t Point This Telescope—The Telescope Points Them” 

    Wired logo

    WIRED

    03.23.17
    Sarah Scoles

    U Texas Austin McDonald Observatory Hobby-Eberly Telescope

    The hills of West Texas rise in waves around the Hobby-Eberly Telescope, a powerful instrument encased in a dome that looks like the Epcot ball. Soon, it will become more powerful still: Scientists recently primed the telescope to find evidence of dark energy in the early universe, prying open its eye so it can see and process a wide swath of sky. On April 8, scientists will dedicate the new telescope, capping off the $40 million upgrade and beginning the real work.

    The dark energy experiment, called Hetdex, isn’t how astronomy has traditionally been done. In the classical model, a lone astronomer goes to a mountaintop and solemnly points a telescope at one predetermined object. But Hetdex won’t look for any objects in particular; it will just scan the sky and churn petabytes of the resulting data through a silicon visual cortex. That’s only possible because of today’s steroidal computers, which let scientists analyze, store, and send such massive quantities of data.

    “Dark energy is not only terribly important for astronomy, it’s the central problem for physics. It’s been the bone in our throat for a long time.”

    Steven Weinberg
    Nobel Laureate
    University of Texas at Austin

    The hope is so-called blind surveys like this one will find stuff astronomers never even knew to look for. In this realm, computers take over curation of the sky, telling astronomers what is interesting and worthy of further study, rather than the other way around. These wide-eyed projects are becoming a standard part of astronomers’ arsenal, and the greatest part about them is that their best discoveries are still totally TBD.

    Big Sky Country

    To understand dark energy—that mysterious stuff that pulls the taffy of spacetime—the Hetdex team needed Hobby-Eberly to study one million galaxies 9-11 billion light-years away as they fly away from Earth. To get that many galaxies in a reasonable amount of time, they broadened the view of its 91 tessellated stop-sign-shaped mirrors by 100. They also created an instrument called Virus, with 35,000 optical fibers that send the light from the universe to a spectrograph, which splits it up into constituent wavelengths. All that data can determine both how far away a galaxy is and how fast it’s traveling away from Earth.

    But when a telescope takes a ton of data down from the sky, scientists can also uncover the unexpected. Hetdex’s astronomers will find more than just the stretch marks of dark energy. They’ll discover things about supermassive black holes, star formation, dark matter, and the ages of stars in nearby galaxies.

    The classical method still has advantages; if you know exactly what you want to look at, you write up a nice proposal to Hubble and explain why a fixed gaze at the Whirlpool Galaxy would yield significant results. “But what you see is what you get,” says astronomer Douglas Hudgins. “This is an object, and the science of that object is what you’re stuck with.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:16 am on February 22, 2017 Permalink | Reply
    Tags: Dark Energy, Paul Sutter

    From CBS: “When the lights went out in the universe” 

    CBS News

    February 21, 2017
    Paul Sutter

    Astronomers think that the expansion of the universe is regulated by both the force of gravity, and a mysterious dark energy. In this artist’s conception, dark energy is represented by the purple grid above, and gravity by the green grid below.
    NASA/JPL-Caltech

    About 5 billion years ago, everything changed. The expansion of the universe, which had been gradually decelerating for billions of years, reversed course and entered into a period of unbridled acceleration. (It was sort of like a car that switches from decelerating to accelerating, but is still moving forward the whole time.) The unhurried, deliberate process of structure formation — the gradual buildup of ever-larger assemblies of matter from galaxies to groups to clusters — froze and began to undo itself.

    Map of voids and superclusters within 500 million light years from Milky Way 8/11/09 http://www.atlasoftheuniverse.com/nearsc.html Richard Powell

    Five billion years ago, a mysterious force overtook the universe. Hidden in the shadows, it lay dormant, buried underneath fields of matter and radiation. But once it uncovered itself, it worked quickly, bending the entire cosmos to its will.

    Five billion years ago, dark energy awoke.

    The guts of the universe

    To explain what’s going on in this overly dramatic telling of the emergence of dark energy, we need to talk about what the universe is made of and how that affects its expansion.

    Let’s start with the mantra of general relativity: mass and energy tell space-time how to bend, and the bending of space-time tells objects how to move. Usually, we think of this as a local interaction, used to explain the orbits of particular planets or the unusual properties of a black hole.

    But those same mathematics of relativity — which provide the needed accuracy for GPS satellites to tell you how close you are to your coffee fix — also serve as the foundation for understanding the growth and evolution of the entire universe. I mean, it is “general” relativity after all.

    The universe is made of all sorts of stuff, and the properties of that stuff influence the overall curvature of the entire cosmos, which impacts its expansion. It’s the mantra of relativity writ large: the mass and energy of the entire universe is bending the spacetime of the entire universe, which is telling the entire universe how to move.

    If the total density of all the stuff is greater than a very specific value — called “the critical density” and equal to about 4 hydrogen atoms per cubic meter — then the universe’s expansion will slow down, stop and reverse in a Big Crunch. If the universe’s density is less than this critical value, the universe will expand forever. And if it’s exactly equal to the critical value, then the universe will expand forever, but at an ever-diminishing rate.
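    A short sketch of where that critical density figure comes from (my own calculation using the standard expression ρc = 3H0²/8πG, with an assumed Hubble constant of about 68 km/s/Mpc):

```python
import math

G = 6.674e-11                # gravitational constant, m^3 kg^-1 s^-2
H0 = 68 * 1000 / 3.086e22    # Hubble constant ~68 km/s/Mpc, converted to 1/s
M_HYDROGEN = 1.674e-27       # mass of a hydrogen atom, kg

rho_crit = 3 * H0**2 / (8 * math.pi * G)   # critical density, kg/m^3
atoms_per_m3 = rho_crit / M_HYDROGEN

print(f"critical density ~ {rho_crit:.2e} kg/m^3")
print(f"equivalent to    ~ {atoms_per_m3:.1f} hydrogen atoms per cubic metre")

# This lands at a few hydrogen atoms per cubic metre (about 5 for this H0),
# the same ballpark as the figure quoted in the article.
```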

    Measurements suggest that we live in a contradictory universe, where the total density exactly equals the critical density — but the universe’s expansion is still accelerating as if the density was too low.

    What in Hubble’s ghost is going on?

    An empty argument

    What’s going on is dark energy. Totaling 69.2 percent of the energy density of the universe, it simply behaves … strangely. Dark energy’s most important property is that its density is constant. Its second most important property is that it appears to be tied to the vacuum of empty space.

    Dark Energy Camera. Built at FNAL
    NOAO/CTIO Victor M Blanco 4m Telescope which houses the DECam at Cerro Tololo, Chile

    Take a box, and empty out everything, removing all the matter (regular and dark), neutrinos, radiation … everything. If you did it right, you’ll have a box of pure, unadulterated vacuum — which means you’ll have a box of pure dark energy. Double the size of the box, and you’ll have double the dark energy.

    This behavior is the total opposite of the behavior of matter and radiation. If you have a box (or, say, a universe) with a fixed amount of matter and you double that container’s volume, the density of matter is cut in half. Radiation’s energy density goes down even further: Not only does the expansion of the universe dilute radiation, it also stretches out its wavelength.

    But as the universe expands, we continually get more empty space (vacuum) in it, so we continually get more dark energy. If you’re worried that this violates some sort of principle of conservation of energy, you can rest easy tonight: The form of the conservation laws taught in Physics 101 only applies to static systems. The universe is a dynamic place, and the concept of “conservation of energy” still holds, but in a more complex, nonintuitive way. But that’s an article for another day.

    You may also be wondering how I can talk so confidently about the nature of dark energy, since we don’t seem to understand it at all. You’re right: We don’t understand dark energy. At all. We know it exists, because we directly observe the accelerated expansion of the universe, and a half-dozen other lines of evidence all point to its existence.

    And while we don’t know what’s creating the accelerated expansion, we do know that we can model it as a property of the vacuum of space that has a constant density, so that’s good enough for now.

    A vacuum and an empty place

    The fact that dark energy has constant density means that in the distant past, it simply didn’t matter — because of matter. All the stuff in the universe was crammed into a smaller volume, which means regular and dark matter had very high densities. This high density meant that for a long time, the expansion of the universe was slowing down.


    The day Dark Energy switched on – Ask a Spaceman! by Paul M. Sutter on YouTube

    But as expansion continued, the matter and radiation in the universe became more and more dilute, and they got less and less dense. Eventually, about 5 billion years ago, the density of matter dropped beneath that of dark energy, which had been holding constant all that time. And once dark energy took over, the game changed completely. Because of the constant nature of its density, compared to the lowering density of matter, expansion not only continued but also accelerated. And that accelerated expansion halted the process of structure formation: Galaxies would love to continue gluing onto each other to form larger structures like clusters and superclusters, but the intervening empty space is inexorably pulling them apart.
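    The crossover described here can be sketched with the standard scaling of each component with the cosmic scale factor a (a = 1 today): matter dilutes as a⁻³, radiation as a⁻⁴, and dark energy stays constant. The rough present-day proportions used below (70 percent dark energy, 30 percent matter) are assumed round numbers.

```python
# Energy densities relative to today's critical density (assumed round numbers)
OMEGA_LAMBDA = 0.7      # dark energy, constant density
OMEGA_MATTER = 0.3      # matter (dark + ordinary), dilutes as a^-3
OMEGA_RADIATION = 9e-5  # radiation, dilutes as a^-4 (tiny today)

def densities(a):
    """Relative densities at scale factor a (a = 1 is the present day)."""
    return OMEGA_MATTER / a**3, OMEGA_RADIATION / a**4, OMEGA_LAMBDA

for a in (0.1, 0.5, 0.75, 1.0, 2.0):
    m, r, de = densities(a)
    leader = "matter" if m > de else "dark energy"
    print(f"a = {a:4.2f}: matter {m:8.3f}, radiation {r:8.5f}, "
          f"dark energy {de:.3f}  -> {leader} dominates")

# Matter wins decisively when the universe was small; dark energy takes over
# once the scale factor passes (0.3 / 0.7)^(1/3) ~ 0.75, i.e. when the universe
# was about three-quarters of its present size, the switch described above.
```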

    Some chance mergers will continue to happen, of course, but the universe’s days of building larger structures are long over.

    A cosmic coincidence

    The emergence of dark energy leaves us with a little puzzle. In the distant past, when matter densities were incredibly high in a compact universe, dark energy didn’t matter at all. In the distant future, matter will be spread so thin — like too little butter over too much bread — that its density will be ridiculously, hilariously, pathetically small compared to dark energy’s.

    Dark energy depiction. Image: Volker Springel/Max Planck Institute for Astrophysics/SP


    The surprising coincidence between dark matter and dark energy – Ask a Spaceman! by Paul M. Sutter on YouTube

    Right now, we live in the in-between epoch, where dark energy is roughly three-quarters of the total mass-energy of the universe and dark matter is about one-quarter (regular matter is a negligible amount). This seems a bit … coincidental. Considering the grand history of the universe, we just happen to observe it in the tiny slice of time when matter and dark energy are trading places.

    Did we just happen to get lucky? To arise to consciousness and observe the universe where both dark matter and dark energy are of roughly equal strength? Or is the universe telling us something more? Maybe it’s not a coincidence at all. Maybe dark matter and dark energy “talk” to each other and keep in balance via additional forces of nature; forces that simply don’t manifest in Earthly laboratories. Maybe they’re connected and related.

    Or maybe not. We simply don’t know. It’s a little too dark out there to tell.

    Paul Sutter is an astrophysicist at The Ohio State University and the chief scientist at COSI Science Center. Sutter is also host of Ask a Spaceman, RealSpace and COSI Science Now.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 1:35 pm on December 2, 2016 Permalink | Reply
    Tags: Dark Energy, Dark Interactions Workshop

    From BNL: “Dark Interactions Workshop Hosts Physicists from Around the World” 

    Brookhaven Lab

    November 23, 2016
    Chelsea Whyte

    Dozens of experimental and theoretical physicists convened at the U.S. Department of Energy’s Brookhaven National Laboratory in October for the second biennial Dark Interactions Workshop. Attendees came from universities and laboratories worldwide to discuss current research and possible future searches for dark sector states such as dark matter.


    Two great cosmic mysteries – dark energy and dark matter — make up nearly 95% of the universe’s energy budget. Dark energy is the proposed agent behind the ever-increasing expansion of the universe. Some force must propel the accelerating rate at which the fabric of space is stretching, but its origin and makeup are still unknown. Dark matter, first proposed over 80 years ago, is theorized to be the mass responsible for most of the immense gravitational pull that galaxy clusters exert. Without its presence, galaxies and galaxy clusters shouldn’t hang together as they do, according to the laws of gravity that permeate our cosmos.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists know this much. It’s a bit like a map of a continent with the outlines drawn, but large holes that need a lot of filling in. “There are a lot of things we know that we don’t know,” said Brookhaven physicist Ketevi Assamagan, who organized the workshop along with Brookhaven physicists Hooman Davoudiasl and Mary Bishai, and Stony Brook University physicist Rouven Essig.

    The Dark Interactions Workshop was created to gather great minds in search of answers to these cosmic questions, and to share knowledge across the many different types of experiments searching for dark-sector particles. “The goals are to search for several well-motivated dark-sector particles with existing and upcoming experiments, but also to propose new experiments that can lead the search for dark forces in the coming decade. This requires in-depth discussions among theorists and experimentalists,” Essig said.

    The sessions ranged from discussing theories to status updates from dark-particle searches following the first workshop two years ago. Attendees included post-docs as well as tenured scientists, and Assamagan said workshops like this are crucial for allowing a diverse and somewhat disparate group of scientists in a dense field of study to get to know each other’s work and build collaborations.

    “Dark matter is one of the hot topics in particle and astrophysics today. We know that we don’t have the complete story when it comes to our universe. Understanding the nature of dark matter would be a revolution,” Assamagan said.

    While tantalizing theories have directed physicists to build new ways to search for dark sector states, conclusive evidence still eludes scientists. “Since there is currently a vast range of possibilities for what could constitute the dark sector, a variety of innovative approaches for answering this question need to be considered,” Davoudiasl said. “To that end, meetings like this are quite helpful as they facilitate the exchange of new ideas.”

    “There’s still a lot of hope. Meetings like this one show that there are a lot of clever people working in this field and a lot of collaboration between them. Hopefully at our next workshop, we’ll be sharing evidence that we’ve discovered something of the dark sector,” said Assamagan.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 7:05 pm on November 30, 2016 Permalink | Reply
    Tags: Dark Energy

    From Quanta: “The Case Against Dark Matter” 

    Quanta Magazine

    November 29, 2016
    Natalie Wolchover

    Erik Verlinde
    Ilvy Njiokiktjien for Quanta Magazine

    For 80 years, scientists have puzzled over the way galaxies and other cosmic structures appear to gravitate toward something they cannot see. This hypothetical “dark matter” seems to outweigh all visible matter by a startling ratio of five to one, suggesting that we barely know our own universe. Thousands of physicists are doggedly searching for these invisible particles.

    But the dark matter hypothesis assumes scientists know how matter in the sky ought to move in the first place. This month, a series of developments has revived a long-disfavored argument that dark matter doesn’t exist after all. In this view, no missing matter is needed to explain the errant motions of the heavenly bodies; rather, on cosmic scales, gravity itself works in a different way than either Isaac Newton or Albert Einstein predicted.

    The latest attempt to explain away dark matter is a much-discussed proposal by Erik Verlinde, a theoretical physicist at the University of Amsterdam who is known for bold and prescient, if sometimes imperfect, ideas. In a dense 51-page paper posted online on Nov. 7, Verlinde casts gravity as a byproduct of quantum interactions and suggests that the extra gravity attributed to dark matter is an effect of “dark energy” — the background energy woven into the space-time fabric of the universe.

    Instead of hordes of invisible particles, “dark matter is an interplay between ordinary matter and dark energy,” Verlinde said.

    To make his case, Verlinde has adopted a radical perspective on the origin of gravity that is currently in vogue among leading theoretical physicists. Einstein defined gravity as the effect of curves in space-time created by the presence of matter. According to the new approach, gravity is an emergent phenomenon. Space-time and the matter within it are treated as a hologram that arises from an underlying network of quantum bits (called “qubits”), much as the three-dimensional environment of a computer game is encoded in classical bits on a silicon chip. Working within this framework, Verlinde traces dark energy to a property of these underlying qubits that supposedly encode the universe. On large scales in the hologram, he argues, dark energy interacts with matter in just the right way to create the illusion of dark matter.

    In his calculations, Verlinde rediscovered the equations of “modified Newtonian dynamics,” or MOND. This 30-year-old theory makes an ad hoc tweak to the famous “inverse-square” law of gravity in Newton’s and Einstein’s theories in order to explain some of the phenomena attributed to dark matter. That this ugly fix works at all has long puzzled physicists. “I have a way of understanding the MOND success from a more fundamental perspective,” Verlinde said.

    Many experts have called Verlinde’s paper compelling but hard to follow. While it remains to be seen whether his arguments will hold up to scrutiny, the timing is fortuitous. In a new analysis of galaxies published on Nov. 9 in Physical Review Letters, three astrophysicists led by Stacy McGaugh of Case Western Reserve University in Cleveland, Ohio, have strengthened MOND’s case against dark matter.

    The researchers analyzed a diverse set of 153 galaxies, and for each one they compared the rotation speed of visible matter at any given distance from the galaxy’s center with the amount of visible matter contained within that galactic radius. Remarkably, these two variables were tightly linked in all the galaxies by a universal law, dubbed the “radial acceleration relation.” This makes perfect sense in the MOND paradigm, since visible matter is the exclusive source of the gravity driving the galaxy’s rotation (even if that gravity does not take the form prescribed by Newton or Einstein). With such a tight relationship between gravity felt by visible matter and gravity given by visible matter, there would seem to be no room, or need, for dark matter.
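    For readers who want to see the shape of this relation, here is a minimal numerical sketch. The fitting function and the characteristic scale of roughly 1.2 × 10⁻¹⁰ metres per second squared are the ones reported by McGaugh and colleagues; the sample acceleration values below are illustrative only.

        import numpy as np

        G_DAGGER = 1.2e-10  # characteristic acceleration scale (m/s^2) quoted by McGaugh et al.

        def g_observed(g_baryonic):
            # Radial acceleration relation: the total (observed) acceleration is a fixed
            # function of the acceleration produced by visible matter alone.
            return g_baryonic / (1.0 - np.exp(-np.sqrt(g_baryonic / G_DAGGER)))

        # High accelerations (inner galaxy): observed ~ baryonic, i.e. ordinary Newtonian behaviour.
        print(g_observed(1e-8) / 1e-8)                    # ~1.0
        # Low accelerations (outer galaxy): observed acceleration exceeds the baryonic one,
        # approaching sqrt(g_baryonic * G_DAGGER) -- the regime usually credited to dark matter.
        print(g_observed(1e-12), np.sqrt(1e-12 * G_DAGGER))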

    Even as dark matter proponents rise to its defense, a third challenge has materialized. In new research that has been presented at seminars and is under review by the Monthly Notices of the Royal Astronomical Society, a team of Dutch astronomers has conducted what they call the first test of Verlinde’s theory: Comparing his formulas to data from more than 30,000 galaxies, Margot Brouwer of Leiden University in the Netherlands and her colleagues found that Verlinde correctly predicts the gravitational distortion or “lensing” of light from the galaxies — another phenomenon that is normally attributed to dark matter. This is somewhat to be expected, as MOND’s original developer, the Israeli astrophysicist Mordehai Milgrom, showed years ago that MOND accounts for gravitational lensing data. Verlinde’s theory will need to succeed at reproducing dark matter phenomena in cases where the old MOND failed.

    Kathryn Zurek, a dark matter theorist at Lawrence Berkeley National Laboratory, said Verlinde’s proposal at least demonstrates how something like MOND might be right after all. “One of the challenges with modified gravity is that there was no sensible theory that gives rise to this behavior,” she said. “If [Verlinde’s] paper ends up giving that framework, then that by itself could be enough to breathe more life into looking at [MOND] more seriously.”

    The New MOND

    In Newton’s and Einstein’s theories, the gravitational attraction of a massive object drops in proportion to the square of the distance away from it. This means stars orbiting around a galaxy should feel less gravitational pull — and orbit more slowly — the farther they are from the galactic center. Stars’ velocities do drop as predicted by the inverse-square law in the inner galaxy, but instead of continuing to drop as they get farther away, their velocities level off beyond a certain point. The “flattening” of galaxy rotation speeds, discovered by the astronomer Vera Rubin in the 1970s, is widely considered to be Exhibit A in the case for dark matter — explained, in that paradigm, by dark matter clouds or “halos” that surround galaxies and give an extra gravitational acceleration to their outlying stars.
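    A rough calculation shows why the flat curves are surprising. Under the inverse-square law, a star orbiting outside the bulk of a galaxy’s visible mass should move at v = √(GM/r), so its speed should fall with distance. The sketch below uses an illustrative round number for the visible mass of a large spiral, not a measurement.

        import numpy as np

        G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
        M_VISIBLE = 1e41     # kg, an illustrative ~5e10 solar masses of stars and gas
        KPC = 3.086e19       # one kiloparsec in metres

        def v_newtonian(r):
            # Circular orbital speed if only the enclosed visible mass supplies gravity.
            return np.sqrt(G * M_VISIBLE / r)

        for r_kpc in (5, 10, 20, 40):
            print(f"{r_kpc:2d} kpc: {v_newtonian(r_kpc * KPC) / 1e3:5.0f} km/s")
        # Speeds drop roughly as 1/sqrt(r); real galaxies instead level off at a constant value.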

    Searches for dark matter particles have proliferated — with hypothetical “weakly interacting massive particles” (WIMPs) and lighter-weight “axions” serving as prime candidates — but so far, experiments have found nothing.

    Lucy Reading-Ikkanda for Quanta Magazine

    Meanwhile, in the 1970s and 1980s, some researchers, including Milgrom, took a different tack. Many early attempts at tweaking gravity were easy to rule out, but Milgrom found a winning formula: When the gravitational acceleration felt by a star drops below a certain level — precisely 0.00000000012 meters per second per second, or 100 billion times weaker than we feel on the surface of the Earth — he postulated that gravity somehow switches from an inverse-square law to something close to an inverse-distance law. “There’s this magic scale,” McGaugh said. “Above this scale, everything is normal and Newtonian. Below this scale is where things get strange. But the theory does not really specify how you get from one regime to the other.”
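    In the deep-MOND regime the tweak has a simple consequence: with the effective acceleration a ≈ √(a_N·a₀), where a_N = GM/r² is the Newtonian value, the circular speed obeys v⁴ = GMa₀ and no longer depends on radius at all, which is exactly the observed flattening. A small sketch with an illustrative galaxy mass:

        G  = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
        A0 = 1.2e-10     # Milgrom's acceleration scale, m/s^2
        M  = 1e41        # kg, illustrative visible mass of a large spiral

        # Deep-MOND limit: a = sqrt(a_N * A0), so v^2/r = sqrt(G*M*A0)/r and v^4 = G*M*A0.
        v_flat = (G * M * A0) ** 0.25
        print(f"flat rotation speed ~ {v_flat / 1e3:.0f} km/s, independent of radius")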

    Physicists do not like magic; when other cosmological observations seemed far easier to explain with dark matter than with MOND, they left the approach for dead. Verlinde’s theory revitalizes MOND by attempting to reveal the method behind the magic.

    Verlinde, ruddy and fluffy-haired at 54 and lauded for highly technical string theory calculations, first jotted down a back-of-the-envelope version of his idea in 2010. It built on a famous paper he had written months earlier, in which he boldly declared that gravity does not really exist. By weaving together numerous concepts and conjectures at the vanguard of physics, he had concluded that gravity is an emergent thermodynamic effect, related to increasing entropy (or disorder). Then, as now, experts were uncertain what to make of the paper, though it inspired fruitful discussions.

    The particular brand of emergent gravity in Verlinde’s paper turned out not to be quite right, but he was tapping into the same intuition that led other theorists to develop the modern holographic description of emergent gravity and space-time — an approach that Verlinde has now absorbed into his new work.

    In this framework, bendy, curvy space-time and everything in it is a geometric representation of pure quantum information — that is, data stored in qubits. Unlike classical bits, qubits can exist simultaneously in two states (0 and 1) with varying degrees of probability, and they become “entangled” with each other, such that the state of one qubit determines the state of the other, and vice versa, no matter how far apart they are. Physicists have begun to work out the rules by which the entanglement structure of qubits mathematically translates into an associated space-time geometry. An array of qubits entangled with their nearest neighbors might encode flat space, for instance, while more complicated patterns of entanglement give rise to matter particles such as quarks and electrons, whose mass causes the space-time to be curved, producing gravity. “The best way we understand quantum gravity currently is this holographic approach,” said Mark Van Raamsdonk, a physicist at the University of British Columbia in Vancouver who has done influential work on the subject.

    The mathematical translations are rapidly being worked out for holographic universes with an Escher-esque space-time geometry known as anti-de Sitter (AdS) space, but universes like ours, which have de Sitter geometries, have proved far more difficult. In his new paper, Verlinde speculates that it’s exactly the de Sitter property of our native space-time that leads to the dark matter illusion.

    De Sitter space-times like ours stretch as you look far into the distance. For this to happen, space-time must be infused with a tiny amount of background energy — often called dark energy — which drives space-time apart from itself. Verlinde models dark energy as a thermal energy, as if our universe has been heated to an excited state. (AdS space, by contrast, is like a system in its ground state.) Verlinde associates this thermal energy with long-range entanglement between the underlying qubits, as if they have been shaken up, driving entangled pairs far apart. He argues that this long-range entanglement is disrupted by the presence of matter, which essentially removes dark energy from the region of space-time that it occupied. The dark energy then tries to move back into this space, exerting a kind of elastic response on the matter that is equivalent to a gravitational attraction.

    Because of the long-range nature of the entanglement, the elastic response becomes increasingly important in larger volumes of space-time. Verlinde calculates that it will cause galaxy rotation curves to start deviating from Newton’s inverse-square law at exactly the magic acceleration scale pinpointed by Milgrom in his original MOND theory.
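    The scale itself is suggestive: Milgrom’s a₀ is numerically close to the cosmic acceleration scale cH₀ divided by 2π, one reason a link between galaxy dynamics and dark energy does not sound outlandish. A quick order-of-magnitude check, using round values:

        C  = 2.998e8                    # speed of light, m/s
        H0 = 70 * 1000 / 3.086e22       # Hubble constant, ~70 km/s/Mpc, converted to 1/s
        A0 = 1.2e-10                    # Milgrom's scale, m/s^2

        print(f"c*H0          = {C * H0:.2e} m/s^2")          # ~6.8e-10
        print(f"c*H0 / (2*pi) = {C * H0 / 6.2832:.2e} m/s^2") # ~1.1e-10, close to A0
        print(f"a0            = {A0:.2e} m/s^2")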

    Van Raamsdonk calls Verlinde’s idea “definitely an important direction.” But he says it’s too soon to tell whether everything in the paper — which draws from quantum information theory, thermodynamics, condensed matter physics, holography and astrophysics — hangs together. Either way, Van Raamsdonk said, “I do find the premise interesting, and feel like the effort to understand whether something like that could be right could be enlightening.”

    One problem, said Brian Swingle of Harvard and Brandeis universities, who also works in holography, is that Verlinde lacks a concrete model universe like the ones researchers can construct in AdS space, giving him more wiggle room for making unproven speculations. “To be fair, we’ve gotten further by working in a more limited context, one which is less relevant for our own gravitational universe,” Swingle said, referring to work in AdS space. “We do need to address universes more like our own, so I hold out some hope that his new paper will provide some additional clues or ideas going forward.”



    The Case for Dark Matter

    Verlinde could be capturing the zeitgeist the way his 2010 entropic-gravity paper did. Or he could be flat-out wrong. The question is whether his new and improved MOND can reproduce phenomena that foiled the old MOND and bolstered belief in dark matter.

    One such phenomenon is the Bullet cluster, a galaxy cluster in the process of colliding with another.

    X-ray photo by Chandra X-ray Observatory of the Bullet Cluster (1E0657-56). Exposure time was 0.5 million seconds (~140 hours) and the scale is shown in megaparsecs. Redshift (z) = 0.3, meaning its light has wavelengths stretched by a factor of 1.3. Based on today’s theories this shows the cluster to be about 4 billion light years away.
    In this photograph, a rapidly moving galaxy cluster with a shock wave trailing behind it seems to have hit another cluster at high speed. The gases collide, and the gravitational fields of the stars and galaxies interact. When the galaxies collided, based on black-body temperature readings, the temperature reached 160 million degrees and X-rays were emitted with great intensity, claiming the title of the hottest known galactic cluster.
    Studies of the Bullet cluster, announced in August 2006, provide the best evidence to date for the existence of dark matter.
    http://cxc.harvard.edu/symposium_2005/proceedings/files/markevitch_maxim.pdf
    User:Mac_Davis

    Superimposed mass density contours, caused by gravitational lensing of dark matter. Photograph taken with Hubble Space Telescope.
    Date 22 August 2006
    http://cxc.harvard.edu/symposium_2005/proceedings/files/markevitch_maxim.pdf
    User:Mac_Davis

    The visible matter in the two clusters crashes together, but gravitational lensing suggests that a large amount of dark matter, which does not interact with visible matter, has passed right through the crash site. Some physicists consider this indisputable proof of dark matter. However, Verlinde thinks his theory will be able to handle the Bullet cluster observations just fine. He says dark energy’s gravitational effect is embedded in space-time and is less deformable than matter itself, which would have allowed the two to separate during the cluster collision.

    But the crowning achievement for Verlinde’s theory would be to account for the suspected imprints of dark matter in the cosmic microwave background (CMB), ancient light that offers a snapshot of the infant universe.

    CMB per ESA/Planck

    The snapshot reveals the way matter at the time repeatedly contracted due to its gravitational attraction and then expanded due to self-collisions, producing a series of peaks and troughs in the CMB data. Because dark matter does not interact except through gravity, it would only have contracted without ever expanding, and this would modulate the amplitudes of the CMB peaks in exactly the way that scientists observe. One of the biggest strikes against the old MOND was its failure to predict this modulation and match the peaks’ amplitudes. Verlinde expects that his version will work — once again, because matter and the gravitational effect of dark energy can separate from each other and exhibit different behaviors. “Having said this,” he said, “I have not calculated this all through.”

    While Verlinde confronts these and a handful of other challenges, proponents of the dark matter hypothesis have some explaining of their own to do when it comes to McGaugh and his colleagues’ recent findings about the universal relationship between galaxy rotation speeds and their visible matter content.

    In October, responding to a preprint of the paper by McGaugh and his colleagues, two teams of astrophysicists independently argued that the dark matter hypothesis can account for the observations. They say the amount of dark matter in a galaxy’s halo would have precisely determined the amount of visible matter the galaxy ended up with when it formed. In that case, galaxies’ rotation speeds, even though they’re set by dark matter and visible matter combined, will exactly correlate with either their dark matter content or their visible matter content (since the two are not independent). However, computer simulations of galaxy formation do not currently indicate that galaxies’ dark and visible matter contents will always track each other. Experts are busy tweaking the simulations, but Arthur Kosowsky of the University of Pittsburgh, one of the researchers working on them, says it’s too early to tell if the simulations will be able to match all 153 examples of the universal law in McGaugh and his colleagues’ galaxy data set. If not, then the standard dark matter paradigm is in big trouble. “Obviously this is something that the community needs to look at more carefully,” Zurek said.

    Even if the simulations can be made to match the data, McGaugh, for one, considers it an implausible coincidence that dark matter and visible matter would conspire to exactly mimic the predictions of MOND at every location in every galaxy. “If somebody were to come to you and say, ‘The solar system doesn’t work on an inverse-square law, really it’s an inverse-cube law, but there’s dark matter that’s arranged just so that it always looks inverse-square,’ you would say that person is insane,” he said. “But that’s basically what we’re asking to be the case with dark matter here.”

    Given the considerable indirect evidence and near consensus among physicists that dark matter exists, it still probably does, Zurek said. “That said, you should always check that you’re not on a bandwagon,” she added. “Even though this paradigm explains everything, you should always check that there isn’t something else going on.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 2:40 pm on November 25, 2016 Permalink | Reply
    Tags: Dark Energy, , GridPP, , Shear brilliance: computing tackles the mystery of the dark universe,   

    From U Manchester: “Shear brilliance: computing tackles the mystery of the dark universe” 

    U Manchester bloc

    University of Manchester

    24 November 2016
    No writer credit found

    Scientists from The University of Manchester working on a revolutionary telescope project have harnessed the power of distributed computing from the UK’s GridPP collaboration to tackle one of the Universe’s biggest mysteries – the nature of dark matter and dark energy.

    Researchers at The University of Manchester have used resources provided by GridPP – which represents the UK’s contribution to the computing grid used to find the Higgs boson at CERN – to run image processing and machine learning algorithms on thousands of images of galaxies from the international Dark Energy Survey.

    Dark Energy Icon

    The Manchester team are part of the collaborative project to build the Large Synoptic Survey Telescope (LSST), a new kind of telescope currently under construction in Chile and designed to conduct a 10-year survey of the dynamic Universe. LSST will be able to map the entire visible sky.

    LSST/Camera, built at SLAC

    LSST Interior
    LSST telescope, currently under construction at Cerro Pachón Chile

    In preparation for the LSST starting its revolutionary scanning, a pilot research project has helped researchers detect and map out the cosmic shear seen across the night sky, one of the tell-tale signs of the dark matter and dark energy thought to make up some 95 per cent of the content of the Universe. This in turn will help prepare for the analysis of the expected 200 petabytes of data the LSST will collect when it starts operating in 2023.
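    Averaged naively over a decade of observing, that data volume alone explains why distributed computing is needed. This is a crude average only; the real nightly output depends on the survey cadence and which data products are counted.

        TOTAL_BYTES = 200e15            # the quoted 200 petabytes
        NIGHTS = 10 * 365               # ten-year survey, counting every night
        print(f"~{TOTAL_BYTES / NIGHTS / 1e12:.0f} TB of survey data per night, on average")  # ~55 TB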

    The pilot research team based at The University of Manchester was led by Dr Joe Zuntz, a cosmologist originally at Manchester’s Jodrell Bank Observatory and now a researcher at the Royal Observatory in Edinburgh.

    “Our overall aim is to tackle the mystery of the dark universe – and this pilot project has been hugely significant. When the LSST is fully operating researchers will face a galactic data deluge – and our work will prepare us for the analytical challenge ahead.”
    Sarah Bridle, Professor of Astrophysics

    Dr George Beckett, the LSST-UK Science Centre Project Manager based at The University of Edinburgh, added: “The pilot has been a great success. Having completed the work, Joe and his colleagues are able to carry out shear analysis on vast image sets much faster than was previously the case. Thanks are due to the members of the GridPP community for their assistance and support throughout.”

    The LSST will produce images of galaxies in a wide variety of frequency bands of the visible electromagnetic spectrum, with each image giving different information about the galaxy’s nature and history. In times gone by, the measurements needed to determine properties like cosmic shear might have been done by hand, or at least with human-supervised computer processing.

    With the billions of galaxies expected to be observed by LSST, such approaches are unfeasible. Specialised image processing and machine learning software (Zuntz 2013) has therefore been developed for use with galaxy images from telescopes like LSST and its predecessors. This can be used to produce cosmic shear maps like those shown in the full article. The challenge then becomes one of processing and managing the data for hundreds of thousands of galaxies and extracting scientific results required by LSST researchers and the wider astrophysics community.

    As each galaxy is essentially independent of other galaxies in the catalogue, the image processing workflow itself is highly parallelisable. This makes it an ideal problem to tackle with the kind of High-Throughput Computing (HTC) resources and infrastructure offered by GridPP. In many ways, the data from CERN’s Large Hadron Collider particle collision events is like that produced by a digital camera (indeed, pixel-based detectors are used near the interaction points) – and GridPP regularly processes billions of such events as part of the Worldwide LHC Computing Grid (WLCG).
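    Because each galaxy can be measured independently, the workload maps naturally onto many workers. The sketch below is not the project’s actual pipeline; the shape-measurement step is a stand-in for a real tool such as IM3SHAPE, shown only to illustrate the pattern.

        from multiprocessing import Pool

        def measure_shear(galaxy):
            # Placeholder for a real shape-measurement call; returns dummy ellipticities
            # so the example runs on its own.
            return {"id": galaxy["id"], "e1": 0.0, "e2": 0.0}

        if __name__ == "__main__":
            catalogue = [{"id": i} for i in range(100_000)]   # stand-in for a DES galaxy list
            with Pool() as workers:
                results = workers.map(measure_shear, catalogue, chunksize=1000)
            print(len(results), "galaxies measured independently")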

    A pilot exercise, led by Dr Joe Zuntz while at The University of Manchester and supported by one of the longest serving and most experienced GridPP experts, Senior System Administrator Alessandra Forti, saw the porting of the image analysis workflow to GridPP’s distributed computing infrastructure. Data from the Dark Energy Survey (DES) was used for the pilot.

    After transferring this data from the US to GridPP Storage Elements, and enabling the LSST Virtual Organisation on a number of GridPP Tier-2 sites, the IM3SHAPE analysis software package (Zuntz, 2013) was tested on local, grid-friendly client machines to ensure smooth running on the grid. Analysis jobs were then submitted and managed using the Ganga software suite, which is able to coordinate the thousands of individual analyses associated with each batch of galaxies. Initial runs were submitted using Ganga to local grid sites, but the pilot progressed to submission to multiple sites via the GridPP DIRAC (Distributed Infrastructure with Remote Agent Control) service. The flexibility of Ganga allows both types of submission, which made the transition from local to distributed running significantly easier.
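    For readers unfamiliar with Ganga, a session of the kind described might look roughly like the sketch below, typed at the Ganga prompt. Job, Executable, ArgSplitter and Dirac are standard Ganga building blocks, but the wrapper script name and the batch-file arguments are placeholders, not the pilot’s actual configuration.

        # Illustrative only -- not the pilot's real job description.
        j = Job(name="des-shear-pilot")
        j.application = Executable(exe="./run_im3shape.sh")   # hypothetical wrapper around the analysis
        j.splitter = ArgSplitter(args=[["batch_%04d.fits" % i] for i in range(1000)])
        j.backend = Dirac()                                   # submit via the GridPP DIRAC service
        j.submit()                                            # Ganga creates and tracks one subjob per batch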

    By the end of the pilot, Dr Zuntz was able to run the image processing workflow on multiple GridPP sites, regularly submitting thousands of analysis jobs on DES images.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    U Manchester campus

    The University of Manchester (UoM) is a public research university in the city of Manchester, England. It was formed in 2004 by the merger of the University of Manchester Institute of Science and Technology (renamed in 1966; established in 1956 as Manchester College of Science and Technology), which had its ultimate origins in the Mechanics’ Institute established in the city in 1824, and the Victoria University of Manchester, founded by charter in 1904 after the dissolution of the federal Victoria University (which also had members in Leeds and Liverpool) but originating in Owens College, founded in Manchester in 1851. The University of Manchester is regarded as a red brick university and was a product of the civic university movement of the late 19th century. It formed a constituent part of the federal Victoria University between 1880, when it received its royal charter, and 1903–1904, when it was dissolved.

    The University of Manchester is ranked 33rd in the world by QS World University Rankings 2015-16. In the 2015 Academic Ranking of World Universities, Manchester is ranked 41st in the world and 5th in the UK. In an employability ranking published by Emerging in 2015, where CEOs and chairmen were asked to select the top universities which they recruited from, Manchester placed 24th in the world and 5th nationally. The Global Employability University Ranking conducted by THE places Manchester at 27th world-wide and 10th in Europe, ahead of academic powerhouses such as Cornell, UPenn and LSE. It is ranked joint 56th in the world and 18th in Europe in the 2015-16 Times Higher Education World University Rankings. In the 2014 Research Excellence Framework, Manchester came fifth in terms of research power and seventeenth for grade point average quality when including specialist institutions. More students try to gain entry to the University of Manchester than to any other university in the country, with more than 55,000 applications for undergraduate courses in 2014 resulting in 6.5 applicants for every place available. According to the 2015 High Fliers Report, Manchester is the most targeted university by the largest number of leading graduate employers in the UK.

    The university owns and operates major cultural assets such as the Manchester Museum, Whitworth Art Gallery, John Rylands Library and Jodrell Bank Observatory which includes the Grade I listed Lovell Telescope.

     