Tagged: Dark Energy

  • richardmitnick 2:39 pm on March 24, 2017 Permalink | Reply
    Tags: Dark Energy

    From WIRED: “Astronomers Don’t Point This Telescope—The Telescope Points Them” 



    Sarah Scoles

    U Texas Austin McDonald Observatory Hobby-Eberly Telescope

    The hills of West Texas rise in waves around the Hobby-Eberly Telescope, a powerful instrument encased in a dome that looks like the Epcot ball. Soon, it will become more powerful still: Scientists recently primed the telescope to find evidence of dark energy in the early universe, prying open its eye so it can see and process a wide swath of sky. On April 8, scientists will dedicate the new telescope, capping off the $40 million upgrade and beginning the real work.

    The dark energy experiment, called Hetdex, isn’t how astronomy has traditionally been done. In the classical model, a lone astronomer goes to a mountaintop and solemnly points a telescope at one predetermined object. But Hetdex won’t look for any objects in particular; it will just scan the sky and churn petabytes of the resulting data through a silicon visual cortex. That’s only possible because of today’s steroidal computers, which let scientists analyze, store, and send such massive quantities of data.

    “Dark energy is not only terribly important for astronomy, it’s the central problem for physics. It’s been the bone in our throat for a long time.”

    Steven Weinberg
    Nobel Laureate
    University of Texas at Austin

    The hope is that so-called blind surveys like this one will find stuff astronomers never even knew to look for. In this realm, computers take over curation of the sky, telling astronomers what is interesting and worthy of further study, rather than the other way around. These wide-eyed projects are becoming a standard part of astronomers’ arsenal, and the greatest part about them is that their best discoveries are still totally TBD.

    Big Sky Country

    To understand dark energy—that mysterious stuff that pulls the taffy of spacetime—the Hetdex team needed Hobby-Eberly to study one million galaxies 9 to 11 billion light-years away as they fly away from Earth. To get that many galaxies in a reasonable amount of time, they broadened the view of its 91 tessellated stop-sign-shaped mirrors by a factor of 100. They also created an instrument called Virus, with 35,000 optical fibers that send the light from the universe to a spectrograph, which splits it up into its constituent wavelengths. All that data can determine both how far away a galaxy is and how fast it’s traveling away from Earth.
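
    The spectrograph measurement above boils down to a simple ratio. A minimal sketch, assuming a hydrogen Lyman-alpha emission line (its rest wavelength is a physical constant; the "observed" wavelength below is an invented example, not survey data):

```python
# Turning a spectral line's observed wavelength into a redshift --
# the basic measurement behind surveys like Hetdex. The observed
# value passed in below is hypothetical, chosen for illustration.

LYMAN_ALPHA_REST_NM = 121.567  # hydrogen Lyman-alpha rest wavelength, nm

def redshift(observed_nm, rest_nm=LYMAN_ALPHA_REST_NM):
    """Redshift z from an observed emission-line wavelength."""
    return observed_nm / rest_nm - 1.0

# A Lyman-alpha line shifted from the far ultraviolet into the visible:
z = redshift(425.0)   # hypothetical observed wavelength in nm
stretch = 1.0 + z     # factor by which all wavelengths are stretched
print(f"z = {z:.2f}, wavelengths stretched by {stretch:.2f}x")
```

    The redshift in turn fixes the galaxy's recession speed and, through a cosmological model, its distance; doing this for a million spectra at once is what Virus's 35,000 fibers make possible.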

    But when a telescope takes a ton of data down from the sky, scientists can also uncover the unexpected. Hetdex’s astronomers will find more than just the stretch marks of dark energy. They’ll discover things about supermassive black holes, star formation, dark matter, and the ages of stars in nearby galaxies.

    The classical method still has advantages; if you know exactly what you want to look at, you write up a nice proposal to Hubble and explain why a fixed gaze at the Whirlpool Galaxy would yield significant results. “But what you see is what you get,” says astronomer Douglas Hudgins. “This is an object, and the science of that object is what you’re stuck with.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 9:16 am on February 22, 2017 Permalink | Reply
    Tags: Dark Energy, Paul Sutter

    From CBS: “When the lights went out in the universe” 

    CBS News

    February 21, 2017
    Paul Sutter

    Astronomers think that the expansion of the universe is regulated by both the force of gravity, and a mysterious dark energy. In this artist’s conception, dark energy is represented by the purple grid above, and gravity by the green grid below.

    About 5 billion years ago, everything changed. The expansion of the universe, which had been gradually decelerating for billions of years, reversed course and entered into a period of unbridled acceleration. (It was sort of like a car that switches from decelerating to accelerating, but is still moving forward the whole time.) The unhurried, deliberate process of structure formation — the gradual buildup of ever-larger assemblies of matter from galaxies to groups to clusters — froze and began to undo itself.

    Map of voids and superclusters within 500 million light years from Milky Way 8/11/09 http://www.atlasoftheuniverse.com/nearsc.html Richard Powell

    Five billion years ago, a mysterious force overtook the universe. Hidden in the shadows, it lay dormant, buried underneath fields of matter and radiation. But once it uncovered itself, it worked quickly, bending the entire cosmos to its will.

    Five billion years ago, dark energy awoke.

    The guts of the universe

    To explain what’s going on in this overly dramatic telling of the emergence of dark energy, we need to talk about what the universe is made of and how that affects its expansion.

    Let’s start with the mantra of general relativity: mass and energy tell space-time how to bend, and the bending of space-time tells objects how to move. Usually, we think of this as a local interaction, used to explain the orbits of particular planets or the unusual properties of a black hole.

    But those same mathematics of relativity — which provide the needed accuracy for GPS satellites to tell you how close you are to your coffee fix — also serve as the foundation for understanding the growth and evolution of the entire universe. I mean, it is “general” relativity after all.

    The universe is made of all sorts of stuff, and the properties of that stuff influence the overall curvature of the entire cosmos, which impacts its expansion. It’s the mantra of relativity writ large: the mass and energy of the entire universe is bending the spacetime of the entire universe, which is telling the entire universe how to move.

    If the total density of all the stuff is greater than a very specific value — called “the critical density” and equal to about 4 hydrogen atoms per cubic meter — then the universe’s expansion will slow down, stop and reverse in a Big Crunch. If the universe’s density is less than this critical value, the universe will expand forever. And if it’s exactly equal to the critical value, then the universe will expand forever, but at an ever-diminishing rate.
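
    The critical density quoted above can be checked on the back of an envelope: it is rho_c = 3H0²/(8πG), where H0 is the Hubble constant. A quick sketch, assuming H0 ≈ 70 km/s/Mpc (the atoms-per-cubic-meter figure shifts with the assumed H0; the article's "about 4" corresponds to a somewhat lower value):

```python
# Back-of-the-envelope check of the critical density
# rho_c = 3 H0^2 / (8 pi G), converted to hydrogen atoms per m^3.
import math

G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
M_HYDROGEN = 1.674e-27  # mass of a hydrogen atom, kg
MPC_M = 3.086e22        # meters per megaparsec

H0 = 70.0 * 1000 / MPC_M                # Hubble constant, s^-1 (assumed)
rho_c = 3 * H0**2 / (8 * math.pi * G)   # critical density, kg/m^3

atoms_per_m3 = rho_c / M_HYDROGEN
print(f"rho_c ~ {rho_c:.2e} kg/m^3 ~ {atoms_per_m3:.1f} H atoms per m^3")
```

    With H0 = 70 km/s/Mpc this comes out near 5 to 6 hydrogen atoms per cubic meter; since rho_c scales as H0², modest changes in the measured expansion rate move the number around.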

    Measurements suggest that we live in a contradictory universe, where the total density exactly equals the critical density — but the universe’s expansion is still accelerating as if the density were too low.

    What in Hubble’s ghost is going on?

    An empty argument

    What’s going on is dark energy. Totaling 69.2 percent of the energy density of the universe, it simply behaves … strangely. Dark energy’s most important property is that its density is constant. Its second most important property is that it appears to be tied to the vacuum of empty space.

    Dark Energy Camera. Built at FNAL
    NOAO/CTIO Victor M Blanco 4m Telescope which houses the DECam at Cerro Tololo, Chile

    Take a box, and empty out everything, removing all the matter (regular and dark), neutrinos, radiation … everything. If you do it right, you’ll have a box of pure, unadulterated vacuum — which means you’ll have a box of pure dark energy. Double the size of the box, and you’ll have double the dark energy.

    This behavior is the total opposite of the behavior of matter and radiation. If you have a box (or, say, a universe) with a fixed amount of matter and you double that container’s volume, the density of matter is cut in half. Radiation’s energy density goes down even further: Not only does the expansion of the universe dilute radiation, it also stretches out its wavelength.
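
    The three scaling behaviors described above can be written in a few lines. A minimal sketch, with densities in arbitrary illustrative units and the scale factor `a` standing for how much the universe has expanded:

```python
# How each component's density scales as the universe grows by a
# scale factor `a`: matter dilutes as 1/a^3 (growing volume),
# radiation as 1/a^4 (volume plus wavelength stretch), and dark
# energy -- modeled as vacuum energy -- stays constant.

def matter_density(rho0, a):
    return rho0 / a**3   # fixed amount of stuff in a growing volume

def radiation_density(rho0, a):
    return rho0 / a**4   # extra factor of 1/a from redshifted wavelengths

def dark_energy_density(rho0, a):
    return rho0          # constant: more space means more dark energy

for a in (1, 2, 4):  # universe doubles in size twice
    print(a, matter_density(8.0, a), radiation_density(16.0, a),
          dark_energy_density(1.0, a))
```

    However small dark energy's density starts out relative to matter and radiation, these scalings guarantee it eventually dominates.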

    But as the universe expands, we continually get more empty space (vacuum) in it, so we continually get more dark energy. If you’re worried that this violates some sort of principle of conservation of energy, you can rest easy tonight: the form of the conservation laws taught in Physics 101 applies only to static systems. The universe is a dynamic place, and the concept of “conservation of energy” still holds, but in a more complex, nonintuitive way. But that’s an article for another day.

    You may also be wondering how I can talk so confidently about the nature of dark energy, since we don’t seem to understand it at all. You’re right: We don’t understand dark energy. At all. We know it exists, because we directly observe the accelerated expansion of the universe, and a half-dozen other lines of evidence all point to its existence.

    And while we don’t know what’s creating the accelerated expansion, we do know that we can model it as a property of the vacuum of space that has a constant density, so that’s good enough for now.

    A vacuum and an empty place

    The fact that dark energy has constant density means that in the distant past, it simply didn’t matter — because of matter. All the stuff in the universe was crammed into a smaller volume, which means regular and dark matter had very high densities. This high density meant that for a long time, the expansion of the universe was slowing down.

    The day Dark Energy switched on – Ask a Spaceman! by Paul M. Sutter on YouTube

    But as expansion continued, the matter and radiation in the universe became more and more dilute, and they got less and less dense. Eventually, about 5 billion years ago, the density of matter dropped beneath that of dark energy, which had been holding constant all that time. And once dark energy took over, the game changed completely. Because of the constant nature of its density, compared to the lowering density of matter, expansion not only continued but also accelerated. And that accelerated expansion halted the process of structure formation: Galaxies would love to continue gluing onto each other to form larger structures like clusters and superclusters, but the intervening empty space is inexorably pulling them apart.
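
    The moment when matter's falling density crosses dark energy's constant one can be estimated directly. A sketch assuming Planck-like round numbers for today's density fractions (the mapping from redshift to "about 5 billion years ago" depends on the assumed expansion history, and acceleration actually begins somewhat before exact density equality):

```python
# When did dark energy take over? Matter density scales as (1+z)^3
# while dark energy stays constant, so the two were equal when
# Omega_m * (1+z)^3 = Omega_L. The density parameters are assumed,
# Planck-like round numbers.
OMEGA_M = 0.31   # matter (dark + ordinary) fraction today
OMEGA_L = 0.69   # dark energy fraction today

# Redshift at which matter and dark-energy densities were equal:
z_equal = (OMEGA_L / OMEGA_M) ** (1.0 / 3.0) - 1.0

# Acceleration begins a bit earlier, when matter density falls below
# *twice* the dark energy density (deceleration parameter q = 0):
z_accel = (2.0 * OMEGA_L / OMEGA_M) ** (1.0 / 3.0) - 1.0

print(f"density equality at z ~ {z_equal:.2f}, "
      f"acceleration begins at z ~ {z_accel:.2f}")
```

    With these numbers, density equality lands around redshift 0.3 and the onset of acceleration around redshift 0.65, both within the last several billion years of cosmic history.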

    Some chance mergers will continue to happen, of course, but the universe’s days of building larger structures are long over.

    A cosmic coincidence

    The emergence of dark energy leaves us with a little puzzle. In the distant past, when matter densities were incredibly high in a compact universe, dark energy didn’t matter at all. In the distant future, matter will be spread so thin — like too little butter over too much bread — that its density will be ridiculously, hilariously, pathetically small compared to dark energy’s.

    Dark energy depiction. Image: Volker Springle/Max Planck Institute for Astrophysics/SP

    The surprising coincidence between dark matter and dark energy – Ask a Spaceman! by Paul M. Sutter on YouTube

    Right now, we live in the in-between epoch, where dark energy is roughly three-quarters of the total mass-energy of the universe and dark matter is about one-quarter (regular matter is a negligible amount). This seems a bit … coincidental. Considering the grand history of the universe, we just happen to observe it in the tiny slice of time when matter and dark energy are trading places.

    Did we just happen to get lucky? To arise to consciousness and observe the universe where both dark matter and dark energy are of roughly equal strength? Or is the universe telling us something more? Maybe it’s not a coincidence at all. Maybe dark matter and dark energy “talk” to each other and keep in balance via additional forces of nature; forces that simply don’t manifest in Earthly laboratories. Maybe they’re connected and related.

    Or maybe not. We simply don’t know. It’s a little too dark out there to tell.

    Paul Sutter is an astrophysicist at The Ohio State University and the chief scientist at COSI Science Center. Sutter is also host of Ask a Spaceman, RealSpace and COSI Science Now.

    See the full article here.

  • richardmitnick 1:35 pm on December 2, 2016 Permalink | Reply
    Tags: Dark Energy, Dark Interactions Workshop

    From BNL: “Dark Interactions Workshop Hosts Physicists from Around the World” 

    Brookhaven Lab

    November 23, 2016
    Chelsea Whyte

    Dozens of experimental and theoretical physicists convened at the U.S. Department of Energy’s Brookhaven National Laboratory in October for the second biennial Dark Interactions Workshop. Attendees came from universities and laboratories worldwide to discuss current research and possible future searches for dark sector states such as dark matter.


    Two great cosmic mysteries, dark energy and dark matter, make up nearly 95% of the universe’s energy budget. Dark energy is the proposed agent behind the ever-increasing expansion of the universe. Some force must drive the accelerating rate at which the fabric of space is stretching, but its origin and makeup are still unknown. Dark matter, first proposed over 80 years ago, is theorized to be the mass responsible for most of the immense gravitational pull that galaxy clusters exert. Without its presence, galaxies and galaxy clusters shouldn’t hang together as they do, according to the laws of gravity that permeate our cosmos.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists know this much. It’s a bit like a map of a continent with the outlines drawn, but large holes that need a lot of filling in. “There are a lot of things we know that we don’t know,” said Brookhaven physicist Ketevi Assamagan, who organized the workshop along with Brookhaven physicists Hooman Davoudiasl and Mary Bishai, and Stony Brook University physicist Rouven Essig.

    The Dark Interactions Workshop was created to gather great minds in search of answers to these cosmic questions, and to share knowledge across the many different types of experiments searching for dark-sector particles. “The goals are to search for several well-motivated dark-sector particles with existing and upcoming experiments, but also to propose new experiments that can lead the search for dark forces in the coming decade. This requires in-depth discussions among theorists and experimentalists,” Essig said.

    The sessions ranged from discussing theories to status updates from dark-particle searches following the first workshop two years ago. Attendees included post-docs as well as tenured scientists, and Assamagan said workshops like this are crucial for allowing a diverse and somewhat disparate group of scientists in a dense field of study to get to know each other’s work and build collaborations.

    “Dark matter is one of the hot topics in particle and astrophysics today. We know that we don’t have the complete story when it comes to our universe. Understanding the nature of dark matter would be a revolution,” Assamagan said.

    While tantalizing theories have directed physicists to build new ways to search for dark sector states, conclusive evidence still eludes scientists. “Since there is currently a vast range of possibilities for what could constitute the dark sector, a variety of innovative approaches for answering this question need to be considered,” Davoudiasl said. “To that end, meetings like this are quite helpful as they facilitate the exchange of new ideas.”

    “There’s still a lot of hope. Meetings like this one show that there are a lot of clever people working in this field and a lot of collaboration between them. Hopefully at our next workshop, we’ll be sharing evidence that we’ve discovered something of the dark sector,” said Assamagan.

    See the full article here.
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 7:05 pm on November 30, 2016 Permalink | Reply
    Tags: Dark Energy

    From Quanta: “The Case Against Dark Matter” 

    Quanta Magazine

    November 29, 2016
    Natalie Wolchover

    Erik Verlinde
    Ilvy Njiokiktjien for Quanta Magazine

    For 80 years, scientists have puzzled over the way galaxies and other cosmic structures appear to gravitate toward something they cannot see. This hypothetical “dark matter” seems to outweigh all visible matter by a startling ratio of five to one, suggesting that we barely know our own universe. Thousands of physicists are doggedly searching for these invisible particles.

    But the dark matter hypothesis assumes scientists know how matter in the sky ought to move in the first place. This month, a series of developments has revived a long-disfavored argument that dark matter doesn’t exist after all. In this view, no missing matter is needed to explain the errant motions of the heavenly bodies; rather, on cosmic scales, gravity itself works in a different way than either Isaac Newton or Albert Einstein predicted.

    The latest attempt to explain away dark matter is a much-discussed proposal by Erik Verlinde, a theoretical physicist at the University of Amsterdam who is known for bold and prescient, if sometimes imperfect, ideas. In a dense 51-page paper posted online on Nov. 7, Verlinde casts gravity as a byproduct of quantum interactions and suggests that the extra gravity attributed to dark matter is an effect of “dark energy” — the background energy woven into the space-time fabric of the universe.

    Instead of hordes of invisible particles, “dark matter is an interplay between ordinary matter and dark energy,” Verlinde said.

    To make his case, Verlinde has adopted a radical perspective on the origin of gravity that is currently in vogue among leading theoretical physicists. Einstein defined gravity as the effect of curves in space-time created by the presence of matter. According to the new approach, gravity is an emergent phenomenon. Space-time and the matter within it are treated as a hologram that arises from an underlying network of quantum bits (called “qubits”), much as the three-dimensional environment of a computer game is encoded in classical bits on a silicon chip. Working within this framework, Verlinde traces dark energy to a property of these underlying qubits that supposedly encode the universe. On large scales in the hologram, he argues, dark energy interacts with matter in just the right way to create the illusion of dark matter.

    In his calculations, Verlinde rediscovered the equations of “modified Newtonian dynamics,” or MOND. This 30-year-old theory makes an ad hoc tweak to the famous “inverse-square” law of gravity in Newton’s and Einstein’s theories in order to explain some of the phenomena attributed to dark matter. That this ugly fix works at all has long puzzled physicists. “I have a way of understanding the MOND success from a more fundamental perspective,” Verlinde said.

    Many experts have called Verlinde’s paper compelling but hard to follow. While it remains to be seen whether his arguments will hold up to scrutiny, the timing is fortuitous. In a new analysis of galaxies published on Nov. 9 in Physical Review Letters, three astrophysicists led by Stacy McGaugh of Case Western Reserve University in Cleveland, Ohio, have strengthened MOND’s case against dark matter.

    The researchers analyzed a diverse set of 153 galaxies, and for each one they compared the rotation speed of visible matter at any given distance from the galaxy’s center with the amount of visible matter contained within that galactic radius. Remarkably, these two variables were tightly linked in all the galaxies by a universal law, dubbed the “radial acceleration relation.” This makes perfect sense in the MOND paradigm, since visible matter is the exclusive source of the gravity driving the galaxy’s rotation (even if that gravity does not take the form prescribed by Newton or Einstein). With such a tight relationship between gravity felt by visible matter and gravity given by visible matter, there would seem to be no room, or need, for dark matter.
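
    The relation McGaugh's team reported is a single function linking the acceleration expected from visible matter alone, g_bar, to the observed acceleration, g_obs. A sketch following the form of their published fit (treat the exact constant as illustrative):

```python
# The "radial acceleration relation": observed acceleration as a
# function of the acceleration supplied by visible (baryonic) matter
# alone, following the fitted form g_obs = g_bar / (1 - exp(-sqrt(g_bar/g+))).
import math

G_DAGGER = 1.2e-10  # fitted acceleration scale, m/s^2

def g_observed(g_bar):
    """Observed acceleration predicted from the baryonic one."""
    return g_bar / (1.0 - math.exp(-math.sqrt(g_bar / G_DAGGER)))

# High accelerations (inner galaxy): essentially Newtonian.
print(g_observed(1e-8) / 1e-8)    # ratio close to 1
# Low accelerations (outer galaxy): a large apparent boost -- the
# regime usually credited to dark matter.
print(g_observed(1e-12) / 1e-12)  # ratio much greater than 1
```

    The point of the result is that one curve with one scale fits all 153 galaxies, with no per-galaxy dark matter knob to turn.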

    Even as dark matter proponents rise to its defense, a third challenge has materialized. In new research that has been presented at seminars and is under review by the Monthly Notices of the Royal Astronomical Society, a team of Dutch astronomers has conducted what they call the first test of Verlinde’s theory: In comparing his formulas to data from more than 30,000 galaxies, Margot Brouwer of Leiden University in the Netherlands and her colleagues found that Verlinde correctly predicts the gravitational distortion or “lensing” of light from the galaxies — another phenomenon that is normally attributed to dark matter. This is somewhat to be expected, as MOND’s original developer, the Israeli astrophysicist Mordehai Milgrom, showed years ago that MOND accounts for gravitational lensing data. Verlinde’s theory will need to succeed at reproducing dark matter phenomena in cases where the old MOND failed.

    Kathryn Zurek, a dark matter theorist at Lawrence Berkeley National Laboratory, said Verlinde’s proposal at least demonstrates how something like MOND might be right after all. “One of the challenges with modified gravity is that there was no sensible theory that gives rise to this behavior,” she said. “If [Verlinde’s] paper ends up giving that framework, then that by itself could be enough to breathe more life into looking at [MOND] more seriously.”

    The New MOND

    In Newton’s and Einstein’s theories, the gravitational attraction of a massive object drops in proportion to the square of the distance away from it. This means stars orbiting around a galaxy should feel less gravitational pull — and orbit more slowly — the farther they are from the galactic center. Stars’ velocities do drop as predicted by the inverse-square law in the inner galaxy, but instead of continuing to drop as they get farther away, their velocities level off beyond a certain point. The “flattening” of galaxy rotation speeds, discovered by the astronomer Vera Rubin in the 1970s, is widely considered to be Exhibit A in the case for dark matter — explained, in that paradigm, by dark matter clouds or “halos” that surround galaxies and give an extra gravitational acceleration to their outlying stars.
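
    The Newtonian expectation described above is just v(r) = sqrt(GM/r) for a star orbiting well outside most of the visible mass. A sketch with round, illustrative numbers for a Milky Way-like galaxy:

```python
# Inverse-square-law prediction for orbital speeds: for a star outside
# the bulk of the enclosed mass, v = sqrt(G M / r), falling as 1/sqrt(r).
# The enclosed mass below is a round, illustrative figure.
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30    # solar mass, kg
KPC_M = 3.086e19    # meters per kiloparsec

M_VISIBLE = 1e11 * M_SUN  # enclosed visible mass (illustrative)

def keplerian_speed(r_kpc):
    """Orbital speed in km/s predicted by the inverse-square law."""
    r = r_kpc * KPC_M
    return math.sqrt(G * M_VISIBLE / r) / 1000.0

for r in (10, 20, 40):
    print(f"{r:3d} kpc: {keplerian_speed(r):.0f} km/s")
# Quadrupling r should halve v -- but observed rotation curves stay flat.
```

    Rubin's flat rotation curves violate exactly this predicted fall-off, which is what the dark matter halo (or modified gravity) is invoked to explain.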

    Searches for dark matter particles have proliferated — with hypothetical “weakly interacting massive particles” (WIMPs) and lighter-weight “axions” serving as prime candidates — but so far, experiments have found nothing.

    Lucy Reading-Ikkanda for Quanta Magazine

    Meanwhile, in the 1970s and 1980s, some researchers, including Milgrom, took a different tack. Many early attempts at tweaking gravity were easy to rule out, but Milgrom found a winning formula: When the gravitational acceleration felt by a star drops below a certain level — precisely 0.00000000012 meters per second per second, or 100 billion times weaker than we feel on the surface of the Earth — he postulated that gravity somehow switches from an inverse-square law to something close to an inverse-distance law. “There’s this magic scale,” McGaugh said. “Above this scale, everything is normal and Newtonian. Below this scale is where things get strange. But the theory does not really specify how you get from one regime to the other.”
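
    Milgrom's tweak can be sketched in code. MOND itself does not fix how gravity interpolates between the two regimes; the "simple" interpolating function used below is one common choice, included here as an assumption for illustration:

```python
# MOND's modification: below the acceleration scale a0, the effective
# acceleration approaches sqrt(g_N * a0), which falls off as 1/r
# instead of 1/r^2. Uses the "simple" interpolating function
# mu(x) = x / (1 + x), one common (assumed) choice.
import math

A0 = 1.2e-10  # Milgrom's acceleration scale, m/s^2

def mond_acceleration(g_newton):
    """Effective acceleration: solve g * mu(g/a0) = g_N for g."""
    return 0.5 * g_newton + math.sqrt((0.5 * g_newton) ** 2 + g_newton * A0)

# Strong field (well above a0): indistinguishable from Newton.
print(mond_acceleration(1e-6) / 1e-6)                     # ratio ~ 1
# Weak field (well below a0): approaches the deep-MOND sqrt(g_N * a0).
print(mond_acceleration(1e-14) / math.sqrt(1e-14 * A0))   # ratio ~ 1
```

    In the deep-MOND regime this yields flat rotation curves, since g = sqrt(GM a0)/r balancing v²/r gives a constant v = (G M a0)^(1/4) independent of radius.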

    Physicists do not like magic; when other cosmological observations seemed far easier to explain with dark matter than with MOND, they left the approach for dead. Verlinde’s theory revitalizes MOND by attempting to reveal the method behind the magic.

    Verlinde, ruddy and fluffy-haired at 54 and lauded for highly technical string theory calculations, first jotted down a back-of-the-envelope version of his idea in 2010. It built on a famous paper he had written months earlier, in which he boldly declared that gravity does not really exist. By weaving together numerous concepts and conjectures at the vanguard of physics, he had concluded that gravity is an emergent thermodynamic effect, related to increasing entropy (or disorder). Then, as now, experts were uncertain what to make of the paper, though it inspired fruitful discussions.

    The particular brand of emergent gravity in Verlinde’s paper turned out not to be quite right, but he was tapping into the same intuition that led other theorists to develop the modern holographic description of emergent gravity and space-time — an approach that Verlinde has now absorbed into his new work.

    In this framework, bendy, curvy space-time and everything in it is a geometric representation of pure quantum information — that is, data stored in qubits. Unlike classical bits, qubits can exist simultaneously in two states (0 and 1) with varying degrees of probability, and they become “entangled” with each other, such that the state of one qubit determines the state of the other, and vice versa, no matter how far apart they are. Physicists have begun to work out the rules by which the entanglement structure of qubits mathematically translates into an associated space-time geometry. An array of qubits entangled with their nearest neighbors might encode flat space, for instance, while more complicated patterns of entanglement give rise to matter particles such as quarks and electrons, whose mass causes the space-time to be curved, producing gravity. “The best way we understand quantum gravity currently is this holographic approach,” said Mark Van Raamsdonk, a physicist at the University of British Columbia in Vancouver who has done influential work on the subject.

    The mathematical translations are rapidly being worked out for holographic universes with an Escher-esque space-time geometry known as anti-de Sitter (AdS) space, but universes like ours, which have de Sitter geometries, have proved far more difficult. In his new paper, Verlinde speculates that it’s exactly the de Sitter property of our native space-time that leads to the dark matter illusion.

    De Sitter space-times like ours stretch as you look far into the distance. For this to happen, space-time must be infused with a tiny amount of background energy — often called dark energy — which drives space-time apart from itself. Verlinde models dark energy as a thermal energy, as if our universe has been heated to an excited state. (AdS space, by contrast, is like a system in its ground state.) Verlinde associates this thermal energy with long-range entanglement between the underlying qubits, as if they have been shaken up, driving entangled pairs far apart. He argues that this long-range entanglement is disrupted by the presence of matter, which essentially removes dark energy from the region of space-time that it occupied. The dark energy then tries to move back into this space, exerting a kind of elastic response on the matter that is equivalent to a gravitational attraction.

    Because of the long-range nature of the entanglement, the elastic response becomes increasingly important in larger volumes of space-time. Verlinde calculates that it will cause galaxy rotation curves to start deviating from Newton’s inverse-square law at exactly the magic acceleration scale pinpointed by Milgrom in his original MOND theory.

    Van Raamsdonk calls Verlinde’s idea “definitely an important direction.” But he says it’s too soon to tell whether everything in the paper — which draws from quantum information theory, thermodynamics, condensed matter physics, holography and astrophysics — hangs together. Either way, Van Raamsdonk said, “I do find the premise interesting, and feel like the effort to understand whether something like that could be right could be enlightening.”

    One problem, said Brian Swingle of Harvard and Brandeis universities, who also works in holography, is that Verlinde lacks a concrete model universe like the ones researchers can construct in AdS space, giving him more wiggle room for making unproven speculations. “To be fair, we’ve gotten further by working in a more limited context, one which is less relevant for our own gravitational universe,” Swingle said, referring to work in AdS space. “We do need to address universes more like our own, so I hold out some hope that his new paper will provide some additional clues or ideas going forward.”

    Access the mp4 video here.

    The Case for Dark Matter

    Verlinde could be capturing the zeitgeist the way his 2010 entropic-gravity paper did. Or he could be flat-out wrong. The question is whether his new and improved MOND can reproduce phenomena that foiled the old MOND and bolstered belief in dark matter.

    One such phenomenon is the Bullet cluster, a galaxy cluster in the process of colliding with another.

    X-ray photo by Chandra X-ray Observatory of the Bullet Cluster (1E0657-56). Exposure time was 0.5 million seconds (~140 hours) and the scale is shown in megaparsecs. Redshift (z) = 0.3, meaning its light has wavelengths stretched by a factor of 1.3. Based on today’s theories this shows the cluster to be about 4 billion light years away.
    In this photograph, a rapidly moving galaxy cluster with a shock wave trailing behind it seems to have hit another cluster at high speed. The gases collide, and the gravitational fields of the stars and galaxies interact. When the galaxies collided, based on black-body temperature readings, the temperature reached 160 million degrees and X-rays were emitted in great intensity, claiming the title of the hottest known galactic cluster.
    Studies of the Bullet cluster, announced in August 2006, provide the best evidence to date for the existence of dark matter.

    Superimposed mass density contours, caused by gravitational lensing of dark matter. Photograph taken with Hubble Space Telescope.
    Date 22 August 2006

    The visible matter in the two clusters crashes together, but gravitational lensing suggests that a large amount of dark matter, which does not interact with visible matter, has passed right through the crash site. Some physicists consider this indisputable proof of dark matter. However, Verlinde thinks his theory will be able to handle the Bullet cluster observations just fine. He says dark energy’s gravitational effect is embedded in space-time and is less deformable than matter itself, which would have allowed the two to separate during the cluster collision.

    But the crowning achievement for Verlinde’s theory would be to account for the suspected imprints of dark matter in the cosmic microwave background (CMB), ancient light that offers a snapshot of the infant universe.

    CMB per ESA/Planck

    The snapshot reveals the way matter at the time repeatedly contracted due to its gravitational attraction and then expanded due to self-collisions, producing a series of peaks and troughs in the CMB data. Because dark matter does not interact electromagnetically, it would only have contracted without ever expanding, and this would modulate the amplitudes of the CMB peaks in exactly the way that scientists observe. One of the biggest strikes against the old MOND was its failure to predict this modulation and match the peaks’ amplitudes. Verlinde expects that his version will work — once again, because matter and the gravitational effect of dark energy can separate from each other and exhibit different behaviors. “Having said this,” he said, “I have not calculated this all through.”

    While Verlinde confronts these and a handful of other challenges, proponents of the dark matter hypothesis have some explaining of their own to do when it comes to McGaugh and his colleagues’ recent findings about the universal relationship between galaxy rotation speeds and their visible matter content.

    In October, responding to a preprint of the paper by McGaugh and his colleagues, two teams of astrophysicists independently argued that the dark matter hypothesis can account for the observations. They say the amount of dark matter in a galaxy’s halo would have precisely determined the amount of visible matter the galaxy ended up with when it formed. In that case, galaxies’ rotation speeds, even though they’re set by dark matter and visible matter combined, will exactly correlate with either their dark matter content or their visible matter content (since the two are not independent). However, computer simulations of galaxy formation do not currently indicate that galaxies’ dark and visible matter contents will always track each other. Experts are busy tweaking the simulations, but Arthur Kosowsky of the University of Pittsburgh, one of the researchers working on them, says it’s too early to tell if the simulations will be able to match all 153 examples of the universal law in McGaugh and his colleagues’ galaxy data set. If not, then the standard dark matter paradigm is in big trouble. “Obviously this is something that the community needs to look at more carefully,” Zurek said.

    Even if the simulations can be made to match the data, McGaugh, for one, considers it an implausible coincidence that dark matter and visible matter would conspire to exactly mimic the predictions of MOND at every location in every galaxy. “If somebody were to come to you and say, ‘The solar system doesn’t work on an inverse-square law, really it’s an inverse-cube law, but there’s dark matter that’s arranged just so that it always looks inverse-square,’ you would say that person is insane,” he said. “But that’s basically what we’re asking to be the case with dark matter here.”

    Given the considerable indirect evidence and near consensus among physicists that dark matter exists, it still probably does, Zurek said. “That said, you should always check that you’re not on a bandwagon,” she added. “Even though this paradigm explains everything, you should always check that there isn’t something else going on.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

  • richardmitnick 2:40 pm on November 25, 2016 Permalink | Reply
    Tags: Dark Energy, , GridPP, , Shear brilliance: computing tackles the mystery of the dark universe,   

    From U Manchester: “Shear brilliance: computing tackles the mystery of the dark universe” 

    U Manchester bloc

    University of Manchester

    24 November 2016
    No writer credit found

    Scientists from The University of Manchester working on a revolutionary telescope project have harnessed the power of distributed computing from the UK’s GridPP collaboration to tackle one of the Universe’s biggest mysteries – the nature of dark matter and dark energy.

    Researchers at The University of Manchester have used resources provided by GridPP – who represent the UK’s contribution to the computing grid used to find the Higgs boson at CERN – to run image processing and machine learning algorithms on thousands of images of galaxies from the international Dark Energy Survey.

    Dark Energy Icon

    The Manchester team are part of the collaborative project to build the Large Synoptic Survey Telescope (LSST), a new kind of telescope currently under construction in Chile and designed to conduct a 10-year survey of the dynamic Universe. LSST will be able to map the entire visible sky.

    LSST/Camera, built at SLAC

    LSST Interior
    LSST telescope, currently under construction at Cerro Pachón Chile

    In preparation for the LSST’s revolutionary scanning, a pilot research project has helped researchers detect and map out the cosmic shear seen across the night sky, one of the tell-tale signs of the dark matter and dark energy thought to make up some 95 per cent of the content of the Universe. This in turn will help prepare for the analysis of the expected 200 petabytes of data the LSST will collect when it starts operating in 2023.

    The pilot research team based at The University of Manchester was led by Dr Joe Zuntz, a cosmologist originally at Manchester’s Jodrell Bank Observatory and now a researcher at the Royal Observatory in Edinburgh.

    “Our overall aim is to tackle the mystery of the dark universe – and this pilot project has been hugely significant. When the LSST is fully operating researchers will face a galactic data deluge – and our work will prepare us for the analytical challenge ahead.”
    Sarah Bridle, Professor of Astrophysics

    Dr George Beckett, the LSST-UK Science Centre Project Manager based at The University of Edinburgh, added: “The pilot has been a great success. Having completed the work, Joe and his colleagues are able to carry out shear analysis on vast image sets much faster than was previously the case. Thanks are due to the members of the GridPP community for their assistance and support throughout.”

    The LSST will produce images of galaxies in a wide variety of frequency bands of the visible electromagnetic spectrum, with each image giving different information about the galaxy’s nature and history. In times gone by, the measurements needed to determine properties like cosmic shear might have been done by hand, or at least with human-supervised computer processing.

    With the billions of galaxies expected to be observed by LSST, such approaches are unfeasible. Specialised image processing and machine learning software (Zuntz 2013) has therefore been developed for use with galaxy images from telescopes like LSST and its predecessors. This can be used to produce cosmic shear maps like those shown in the figure below. The challenge then becomes one of processing and managing the data for hundreds of thousands of galaxies and extracting scientific results required by LSST researchers and the wider astrophysics community.

    As each galaxy is essentially independent of other galaxies in the catalogue, the image processing workflow itself is highly parallelisable. This makes it an ideal problem to tackle with the kind of High-Throughput Computing (HTC) resources and infrastructure offered by GridPP. In many ways, the data from CERN’s Large Hadron Collider particle collision events is like that produced by a digital camera (indeed, pixel-based detectors are used near the interaction points) – and GridPP regularly processes billions of such events as part of the Worldwide LHC Computing Grid (WLCG).
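Because each galaxy can be analysed independently, the workflow maps naturally onto a pool of workers. A minimal sketch using Python’s standard multiprocessing pool — here `measure_shear` is a hypothetical stand-in for an IM3SHAPE-style per-galaxy measurement, not the actual GridPP workflow:

```python
from multiprocessing import Pool

def measure_shear(galaxy_image):
    """Hypothetical stand-in for a per-galaxy shape measurement.
    Each 'image' is just a list of pixel values; we return a toy statistic."""
    return sum(galaxy_image) / len(galaxy_image)

def run_catalogue(images, workers=4):
    # Each galaxy is independent, so the catalogue can be farmed out to any
    # number of workers -- the essence of high-throughput computing.
    with Pool(workers) as pool:
        return pool.map(measure_shear, images)

if __name__ == "__main__":
    catalogue = [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]
    print(run_catalogue(catalogue, workers=2))
```

On a grid, the same pattern scales out: the pool becomes thousands of batch jobs, each handling a slice of the catalogue.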

    A pilot exercise, led by Dr Joe Zuntz while at The University of Manchester and supported by one of the longest serving and most experienced GridPP experts, Senior System Administrator Alessandra Forti, saw the porting of the image analysis workflow to GridPP’s distributed computing infrastructure. Data from the Dark Energy Survey (DES) was used for the pilot.

    After transferring this data from the US to GridPP Storage Elements, and enabling the LSST Virtual Organisation on a number of GridPP Tier-2 sites, the IM3SHAPE analysis software package (Zuntz, 2013) was tested on local, grid-friendly client machines to ensure smooth running on the grid. Analysis jobs were then submitted and managed using the Ganga software suite, which is able to coordinate the thousands of individual analyses associated with each batch of galaxies. Initial runs were submitted using Ganga to local grid sites, but the pilot progressed to submission to multiple sites via the GridPP DIRAC (Distributed Infrastructure with Remote Agent Control) service. The flexibility of Ganga allows both types of submission, which made the transition from local to distributed running significantly easier.

    By the end of pilot, Dr Zuntz was able to run the image processing workflow on multiple GridPP sites, regularly submitting thousands of analysis jobs on DES images.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    U Manchester campus

    The University of Manchester (UoM) is a public research university in Manchester, England, formed in 2004 by the merger of two institutions. The first, the University of Manchester Institute of Science and Technology (established in 1956 as the Manchester College of Science and Technology and renamed in 1966), had its ultimate origins in the Mechanics’ Institute established in the city in 1824. The second, the Victoria University of Manchester, was founded by charter in 1904 after the dissolution of the federal Victoria University (which also had members in Leeds and Liverpool), but originated in Owens College, founded in Manchester in 1851. The University of Manchester is regarded as a red brick university, and was a product of the civic university movement of the late 19th century. It formed a constituent part of the federal Victoria University between 1880, when it received its royal charter, and 1903–1904, when that federation was dissolved.

    The University of Manchester is ranked 33rd in the world by QS World University Rankings 2015-16. In the 2015 Academic Ranking of World Universities, Manchester is ranked 41st in the world and 5th in the UK. In an employability ranking published by Emerging in 2015, where CEOs and chairmen were asked to select the top universities which they recruited from, Manchester placed 24th in the world and 5th nationally. The Global Employability University Ranking conducted by THE places Manchester at 27th world-wide and 10th in Europe, ahead of academic powerhouses such as Cornell, UPenn and LSE. It is ranked joint 56th in the world and 18th in Europe in the 2015-16 Times Higher Education World University Rankings. In the 2014 Research Excellence Framework, Manchester came fifth in terms of research power and seventeenth for grade point average quality when including specialist institutions. More students try to gain entry to the University of Manchester than to any other university in the country, with more than 55,000 applications for undergraduate courses in 2014 resulting in 6.5 applicants for every place available. According to the 2015 High Fliers Report, Manchester is the most targeted university by the largest number of leading graduate employers in the UK.

    The university owns and operates major cultural assets such as the Manchester Museum, Whitworth Art Gallery, John Rylands Library and Jodrell Bank Observatory which includes the Grade I listed Lovell Telescope.

  • richardmitnick 10:32 am on September 29, 2016 Permalink | Reply
    Tags: , AU, Dark Energy, , Stawell gold mine in western Victoria, Stawell Underground Physics Laboratory (SUPL)   

    From ARC Center of Excellence for Particle Physics at the Terascale: “Digging for Dark Matter” 


    ARC Centre of Excellence for Particle Physics at the Terascale

    Digging for Dark Matter

    A tiny Australian mining town might hold the key to solving one of the universe’s biggest mysteries – and to a local economic boom. What do scientists hope to find in a cave 1km underground?

    Lisa Clausen

    The public lookout point for the Stawell gold mine in western Victoria is an unremarkable spot; a few faded information boards and pieces of old equipment, rusting among crooked gums beside the mine’s high perimeter fence.

    Beyond the wire, near the slurry-coloured mine machinery which roars into the chill air, a dirt road descends steadily, through a rocky cutting, into the mine’s black mouth. It’s noisy, muddy and industrious – much like any number of working mines on a weekday – but not the sort of place where you’d imagine science might finally answer one of the great questions of our universe.

    And yet it could be. For more than 40 years, the mystery of dark matter has defied the world’s best physicists. These invisible particles are thought to be everywhere – constantly passing through each of us and our planet. In fact, we can only observe five per cent of the whole universe; the rest is dark matter and dark energy.

    Scientists have found compelling indirect evidence of dark matter’s existence through gravitational lensing – the bending of the visible light we see coming from distant galaxies by dark matter’s gravity. Yet this “stuff”, thought to shape galaxies and be the universe’s missing mass, remains frustratingly elusive. Directly detecting dark matter would be one of the greatest prizes of modern physics.

    “Dark matter holds galaxies together,” says University of Melbourne particle physicist, Professor Elisabetta Barberio. “If we understand it, we will understand how the universe evolved from the Big Bang to now, and how it might continue to evolve.”

    University of Melbourne particle physicist, Professor Elisabetta Barberio. Photo: Peter Casamento

    Barberio is the project leader of an ambitious experiment set to happen in Australia. Until now, efforts to find dark matter have all taken place in the northern hemisphere, with plenty of funding and facilities. Now, thanks to the Stawell Underground Physics Laboratory (SUPL), the southern hemisphere will join the global hunt.

    Underground Physics

    SUPL’s SABRE dark matter experiment will run 1km underground in a country town of just over 6,000 people, best known for gold, farming, and a famous annual footrace – the Stawell Gift.


    Stawell’s 160 years of gold mining history have left a network of tunnels under its streets, and disused shafts which occasionally open up in people’s gardens. It’s very much a mining town, but it faced disastrous news in late 2012, when the mine’s then-owners came to the town council warning of a potential closure. After all, at its peak the mine employed about 400 people, while hundreds of others in local businesses benefited from its success.

    A panel of councillors, council staff, locals and mine management was convened to brainstorm ideas for what might come next. The list of community proposals quickly grew: should they start growing mushrooms? Or open a subterranean hotel?

    That same year, three hours down the highway in Melbourne, a group of physicists was wondering where they could stage a dark matter detection experiment. Swinburne University of Technology astrophysicist Jeremy Mould wrote to several mines across the country, outlining the group’s unusual request for a spare underground cavern. One of those letters went to Stawell’s council.

    Probing the nature of the universe hadn’t been on Stawell’s short list – yet. But what physicists needed was an underground site deep enough, and in the right sort of landscape, to block out the high-energy cosmic rays which relentlessly pelt the Earth’s surface. To the experiment’s incredibly sensitive detector, these rays are like a raucous radio station. The best way to turn down the volume is to head underground.

    Stawell’s mine, in places dug almost 2km deep through dense volcanic basalt, looked promising. Because it’s a mine with ramp access, rather than a vertical shaft, people and equipment could be driven in. And because it was in operation, power, ventilation and internet access were already in place.

    Most importantly, initial background radiation readings inside the mine were encouragingly low. Talks with council began and, in 2014, 60 scientists from around the world arrived to inspect the proposed site of the southern hemisphere’s first underground physics lab.

    “The penny dropped then that there was really something in this,” says Northern Grampians Shire Mayor Murray Emerson.

    Now, with $3.5m from the Victorian and federal governments, construction is due to begin later this year. The rock above SUPL will be a radiation shield equivalent to 3km of water. Even so, maintaining the lowest radiation levels possible means a complex build.

    It means everything from concrete to rock bolts must be tested before it can be used. By May this year, 26 component samples from sources as far apart as Adelaide and Gladstone had been tested, but only two were found suitable. Quarry materials such as sand and aggregate must travel by road or rail for analysis at the Australian Nuclear Science and Technology Organisation in Sydney – air transport would expose them to too much cosmic radiation. Materials such as the special concrete spray coating for the rock walls will have to be mixed on-site in specific containers, and stored away from mine materials.

    “It’s certainly an unusual challenge,” says site project engineer Allan Ralph. “Things that you would do on the surface without thinking about them have a significant difficulty to them when they’re underground and forming part of a world-class physics laboratory.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The objectives for the ARC Centres of Excellence are to:

    undertake highly innovative and potentially transformational research that aims to achieve international standing in the fields of research envisaged and leads to a significant advancement of capabilities and knowledge
    link existing Australian research strengths and build critical mass with new capacity for interdisciplinary, collaborative approaches to address the most challenging and significant research problems
    develop relationships and build new networks with major national and international centres and research programs to help strengthen research, achieve global competitiveness and gain recognition for Australian research
    build Australia’s human capacity in a range of research areas by attracting and retaining, from within Australia and abroad, researchers of high international standing as well as the most promising research students
    provide high-quality postgraduate and postdoctoral training environments for the next generation of researchers
    offer Australian researchers opportunities to work on large-scale problems over long periods of time
    establish Centres that have an impact on the wider community through interaction with higher education institutes, governments, industry and the private and non-profit sector.

  • richardmitnick 4:57 pm on September 13, 2016 Permalink | Reply
    Tags: , , , Cosmic distance ladder, , Dark Energy,   

    From Ethan Siegel: “GAIA Satellite To Find Out If We’re Wrong About Dark Energy And The Expanding Universe” 

    From Ethan Siegel

    Sep 13, 2016

    ESA/Gaia satellite

    How far away are the most distant objects in the Universe? How has the Universe expanded over the course of its history? And therefore, how big and how old is the Universe since the Big Bang? Through a number of ingenious developments, humanity has come up with two separate ways to answer these questions:

    To look at the minuscule fluctuations on all scales in the leftover glow from the Big Bang — the Cosmic Microwave Background — and to reconstruct the Universe’s composition and expansion history from that.
    To measure the distances to the stars, the nearby galaxies, and the more distant galaxies individually, and reconstruct the Universe’s expansion rate and history from this progressive “cosmic distance ladder.”

    The Gaia Deployable Sunshield Assembly (DSA) during deployment testing in the S1B integration building at Europe’s spaceport in Kourou, French Guiana, two months before launch. Image credit: ESA-M. Pedoussaut.

    Interestingly enough, these two methods disagree by a significant amount, and the European Space Agency’s GAIA satellite, poised for its first data release tomorrow, September 14th, intends to resolve it one way or another.

    Image credit: ESA and the Planck Collaboration, of the best-ever map of the fluctuations in the cosmic microwave background.

    The leftover glow from the Big Bang is only one data set, but it’s perhaps the most powerful data set we could have asked for nature to provide us with. It tells us the Universe expands with a Hubble constant of 67 km/s/Mpc, meaning that for every megaparsec (about 3.26 million light-years) separating two galaxies, the expanding Universe pushes them apart at an extra 67 km/s. The Cosmic Microwave Background also tells us how the Universe has expanded over its history, giving us a Universe that’s 68% dark energy, 32% dark-and-normal matter combined, and with an age of 13.81 billion years. Beginning with COBE and heavily refined later by BOOMERanG, WMAP and now Planck, this is perhaps the best data humanity has ever obtained for precision cosmology.
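The two numbers here — 67 km/s/Mpc and 13.81 billion years — are linked: 1/H0 sets the rough timescale of the expansion. A back-of-the-envelope sketch in plain Python (1/H0 only approximates the true age, which depends on the full expansion history):

```python
H0 = 67.0              # Hubble constant in km/s/Mpc, the CMB-derived value
KM_PER_MPC = 3.086e19  # kilometres in one megaparsec
SEC_PER_YR = 3.156e7   # seconds in one year

def recession_speed(distance_mpc, h0=H0):
    """Hubble's law: v = H0 * d, returned in km/s."""
    return h0 * distance_mpc

def hubble_time_gyr(h0=H0):
    """1/H0 expressed in billions of years -- a crude age estimate."""
    seconds = KM_PER_MPC / h0
    return seconds / (SEC_PER_YR * 1e9)

print(recession_speed(100))         # a galaxy 100 Mpc away recedes at 6700 km/s
print(round(hubble_time_gyr(), 1))  # ~14.6 Gyr, in the ballpark of 13.81
```

That 1/H0 overshoots the quoted 13.81 billion years precisely because the expansion rate has not been constant — which is the whole point of mapping the expansion history.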



    The construction of the cosmic distance ladder involves going from our Solar System to the stars to nearby galaxies to distant ones. Each “step” carries along its own uncertainties. Image credit: NASA,ESA, A. Feild (STScI), and A. Riess (STScI/JHU).

    But there’s another way to measure how the Universe has expanded over its history: by constructing a cosmic distance ladder. One cannot simply look at a distant galaxy and know how far away it is from us; it took hundreds of years of astronomy just to learn that the sky’s great spirals and ellipticals weren’t even contained within the Milky Way! It took a tremendous series of steps to figure out how to measure astronomical distances accurately:

    We needed to learn how to measure Solar System distances, which took the developments of Newton and Kepler, plus the invention of the telescope.
    We needed to learn how to measure the distances to the stars, which relied on a geometric technique known as parallax, as a function of Earth’s motion in its orbit.
    We needed to learn how to classify stars and use properties that we could measure from those parallax stars in other galaxies, thereby learning our first galactic distances.
    And finally, we needed to identify other galactic properties that were measurable, such as surface brightness fluctuations, rotation speeds or supernovae within them, to measure the distances to the farthest galaxies.

    This latter method is older, more straightforward and requires far fewer assumptions. But it also disagrees with the Cosmic Microwave Background method, and has for a long time. In particular, the expansion rate looks to be about 10% faster: 74 km/s/Mpc instead of 67, meaning — if the distance ladder method is right — that the Universe is either younger and smaller than we thought, or that the amount of dark energy is different from what the other method indicates. There’s a big uncertainty there, however, and the largest component comes in the parallax measurement of the stars nearest to Earth.

    The parallax method, employed by GAIA, involves noting the apparent change in position of a nearby star relative to the more distant, background ones. Image credit: ESA/ATG medialab.

    This is where the GAIA satellite comes into play. Outstripping all previous efforts, GAIA will measure the brightnesses and positions of over one billion stars in the Milky Way, the largest survey ever undertaken of our own galaxy. It expects to do parallax measurements for millions of these to an accuracy of 20 micro-arc-seconds (µas), and for hundreds of millions more to an accuracy of 200 µas. All of the stars visible with the naked eye will do even better, with as little as 7 µas precision for everything visible to a human through a pair of binoculars.
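Parallax angles translate into distances via d(parsecs) = 1/p(arcseconds), so the precisions quoted above set hard limits on how far the method reaches. A quick sketch (the distance figures are order-of-magnitude illustrations, not GAIA specifications):

```python
MICRO_ARCSEC = 1e-6  # one micro-arc-second, expressed in arcseconds

def distance_pc(parallax_arcsec):
    """Distance in parsecs from a parallax angle in arcseconds: d = 1/p."""
    return 1.0 / parallax_arcsec

# A star measured with a 200 uas parallax lies at ~5,000 parsecs:
print(distance_pc(200 * MICRO_ARCSEC))

# For a ~10% distance measurement, the parallax must be about ten times
# the precision; at 20 uas precision that means stars out to ~5,000 pc:
print(distance_pc(10 * 20 * MICRO_ARCSEC))
```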

    A map of star density in the Milky Way and surrounding sky, clearly showing the Milky Way, large and small Magellanic Clouds, and if you look more closely, NGC 104 to the left of the SMC, NGC 6205 slightly above and to the left of the galactic core, and NGC 7078 slightly below. Image credit: ESA/GAIA.

    GAIA was launched in 2013 and has been operational for nearly two full years at this point, meaning it’s collected data on all of these stars at many different points in our planet’s orbit around the Sun. Obtaining parallax measurements means we can get the full three-dimensional positions of these stars in space, and can even infer their proper motions at these accuracies, meaning we can dramatically reduce the uncertainties in the distances to the stars. What’s most spectacular is that many of these stars will be of the same types that we can measure in other star clusters and galaxies, enabling us to build a better, more robust cosmic distance ladder. When the GAIA results come out — and have been fully analyzed by the astronomical community — we’ll have our best-ever understanding of the Universe’s expansion history and of the distances to the farthest galaxies in the Universe, all because we measured what’s happening right here at home.

    Inflationary Universe. NASA/WMAP

    Right now, the Cosmic Microwave Background and the cosmic distance ladder are giving us two different answers to the question of the age, expansion rate and composition of our Universe. They’re not very different, but the fact that they disagree points to one of two possible things. Either one (or both) of the measurements are in error, or there’s a fundamental tension between these two types of measurement that might mean our Universe is a funnier place than we’ve realized to date. When the results from GAIA come out tomorrow, the great hope of most astronomers is that the previous parallax measurements will be shown to have been in error, and our best understanding of the Universe will hold up and be vindicated. But nature has surprised us before, and — if you’re hoping for something new — keep in mind that it just might do so again.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

  • richardmitnick 1:01 pm on September 1, 2016 Permalink | Reply
    Tags: , , Dark Energy, , Super light particles?,   

    From Symmetry: “Universe steps on the gas” 

    Symmetry Mag


    Shannon Hall

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Dark Energy Survey

    A puzzling mismatch is forcing astronomers to re-think how well they understand the expansion of the universe.

    Astronomers think the universe might be expanding faster than expected.

    If true, it could reveal an extra wrinkle in our understanding of the universe, says Nobel Laureate Adam Riess of the Space Telescope Science Institute and Johns Hopkins University. That wrinkle might point toward new particles or suggest that the strength of dark energy, the mysterious force accelerating the expansion of the universe, actually changes over time.

    The result appears in a study published in The Astrophysical Journal this July, in which Riess’s team measured the current expansion rate of the universe, also known as the Hubble constant, better than ever before.

    In theory, determining this expansion is relatively simple, as long as you know the distance to a galaxy and the rate at which it is moving away from us. But distance measurements are tricky in practice and require using objects of known brightness, so-called standard candles, to gauge their distances.

    The use of Type Ia supernovae—exploding stars that shine with the same intrinsic luminosity—as standard candles led to the discovery that the universe was accelerating in the first place and earned Riess, as well as Saul Perlmutter and Brian Schmidt, a Nobel Prize in 2011.

    The latest measurement builds on that work and indicates that the universe is expanding by 73.2 kilometers per second per megaparsec (a unit that equals 3.3 million light-years). Think about dividing the universe into grids that are each a megaparsec long. Every time you reach a new grid, the universe is expanding 73.2 kilometers per second faster than the grid before.

    Although the analysis pegs the Hubble constant to within experimental errors of just 2.4 percent, the latest result doesn’t match the expansion rate predicted from the universe’s trajectory. Here, astronomers measure the expansion rate from the radiation released 380,000 years after the Big Bang and then run that expansion forward in order to calculate what today’s expansion rate should be.

    It’s similar to throwing a ball in the air, Riess says. If you understand the state of the ball (how fast it’s traveling and where it is) and the physics (gravity and drag), then you should be able to precisely predict how fast that ball is traveling later on.

    “So in this case, instead of a ball, it’s the whole universe, and we think we should be able to predict how fast it’s expanding today,” Riess says. “But the caveat, I would say, is that most of the universe is in a dark form that we don’t understand.”

    The rates predicted from measurements made on the early universe with the Planck satellite are 9 percent smaller than the rates measured by Riess’ team—a puzzling mismatch that suggests the universe could be expanding faster than physicists think it should.
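The size of the mismatch follows directly from the two numbers quoted in this article; a quick check in plain Python:

```python
measured = 73.2   # km/s/Mpc, from the distance ladder (Riess et al.)
predicted = 67.0  # km/s/Mpc, extrapolated forward from the early universe

# How much faster is the measured expansion than the predicted one?
excess_percent = (measured / predicted - 1.0) * 100
print(round(excess_percent, 1))  # ~9, the mismatch described in the text
```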

    David Kaplan, a theorist at Johns Hopkins University who was not involved with the study, is intrigued by the discrepancy because it could be easily explained with the addition of a new theory, or even a slight tweak to a current theory.

    “Sometimes there’s a weird discrepancy or signal and you think ‘holy cow, how am I ever going to explain that?’” Kaplan says. “You try to come up with some cockamamie theory. This, on the other hand, is something that lives in a regime where it’s really easy to explain it with new degrees of freedom.”

    Kaplan’s favorite explanation is that there’s an undiscovered particle, which would affect the expansion rate in the early universe. “If there are super light particles that haven’t been taken into account yet and they make up some smallish fraction of the universe, it seems that can explain the discrepancy relatively comfortably,” he says.

    But others disagree. “We understand so little about dark energy that it’s tempting to point to something there,” says David Spergel, an astronomer from Princeton University who was also not involved in the study. One explanation is that dark energy, the cause of the universe’s accelerating expansion, is growing stronger with time.

    “The idea is that if dark energy is constant, clusters of galaxies are moving apart from each other but the clusters of galaxies themselves will remain forever bound,” says Alex Filippenko, an astronomer at the University of California, Berkeley and a co-author on Riess’ paper. But if dark energy is growing in strength over time, then one day—far in the future—even clusters of galaxies will get ripped apart. And the trend doesn’t stop there, he says. Galaxies, clusters of stars, stars, planetary systems, planets, and then even atoms will be torn to shreds one by one.

    The implications could—literally—be Earth-shattering. But it’s also possible that one of the two measurements is wrong, so both teams are currently working toward even more precise measurements. The latest discrepancy is also relatively minor compared to past disagreements.

    “I’m old enough to remember when I was first a student and went to conferences and people argued over whether the Hubble constant was 50 or 100,” says Spergel. “We’re now in a situation where the low camp is arguing for 67 and the high camp is arguing for 73. So we’ve made progress! And that’s not to belittle this discrepancy. I think it’s really interesting. It could be the signature of new physics.”

See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 9:43 am on July 28, 2016 Permalink | Reply
Tags: Could Dark Energy Be Caused By A Reaction To What’s In The Universe?, Dark Energy

    From Ethan Siegel: “Could Dark Energy Be Caused By A Reaction To What’s In The Universe?” 


    Jul 28, 2016

    The full UV-visible-IR composite of the XDF; the greatest image ever released of the distant Universe. Every galaxy shown here will eventually accelerate away from us at greater than the speed of light, thanks to dark energy. Image credit: NASA, ESA, H. Teplitz and M. Rafelski (IPAC/Caltech), A. Koekemoer (STScI), R. Windhorst (Arizona State University), and Z. Levay (STScI).

    Most of the forces and phenomena in the Universe have causes that can be easily uncovered. Two massive objects experience a gravitational force due to the fact that spacetime is curved by the presence of matter and energy. The Universe has expanded as it has over its history because of the changing energy density of the Universe and the initial expansion conditions. And all the particles in the Universe experience the interactions they do because of the known rules of quantum field theory and the exchange of vector bosons. From the smallest, subatomic particles to the largest scales of all, the same forces are at play, holding everything from protons to people to planets to galaxies together.

    Image credit: Wikipedia / Wikimedia Commons user Qashqaiilove.

Even some of the more mysterious phenomena have underlying explanations that are well-understood. We don’t know how there got to be more matter than antimatter in the Universe, but we know that the conditions we need for it — baryon number violation, out-of-equilibrium conditions, and C- and CP-violation — all exist. We don’t know what the nature of dark matter is, but its generic properties, where it’s located, and how it clumps together are well-understood. And we don’t know whether black holes preserve information or not, but we understand the final and initial states of these objects, as well as how they come to be and what happens to their event horizons over time.

Illustration of a black hole and its surrounding, accelerating and infalling accretion disk. Image credit: NASA.

But there’s one thing we don’t understand at all: dark energy. Sure, we can measure the acceleration of the Universe, and determine exactly what its magnitude is. But why do we have a Universe with a non-zero value for dark energy at all? Why should empty space, devoid of everything — no matter, no curvature, no radiation, no anything — have a positive, non-zero energy? Why should it cause the Universe itself to expand at an always-positive rate that never approaches zero? And why should the amount of energy it has be so unbelievably tiny that it was completely unnoticeable for the first few billion years of the Universe’s history, and only came to dominate the Universe around the time that Earth was being formed?
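The timing in that last question follows from how densities scale with the expansion. A back-of-the-envelope sketch, assuming round present-day density parameters (roughly 30 percent matter, 70 percent dark energy):

```python
# Matter density dilutes as a**-3 as the universe expands, while dark energy
# stays constant, so dark energy takes over at a = (omega_m / omega_l)**(1/3).

omega_m, omega_l = 0.3, 0.7  # assumed round density parameters

a_cross = (omega_m / omega_l) ** (1 / 3)  # scale factor when the two are equal
z_cross = 1 / a_cross - 1                 # the corresponding redshift

print(f"dark energy overtook matter at a = {a_cross:.2f}, i.e. z = {z_cross:.2f}")
```

A redshift near 0.3 corresponds to a lookback time of a few billion years, broadly consistent with dark energy coming to dominate around the era when Earth was forming.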

    Protoplanetary disks, which all solar systems are thought to form with, will coalesce into planets over time. Illustration credit: NAOJ.

There’s a lot of empty space, and we know there are quantum fields all throughout it. There are no regions of the Universe where the gravitational, electromagnetic or nuclear forces can’t reach; they are absolutely everywhere. If we try to calculate what we call the vacuum expectation value (VEV) of the different quantum fields there, we can only do it approximately, because there are an infinite number of terms we can write down that go to arbitrarily high order. If we truncate the series at any point, we can add up the approximate contributions, and we wind up very disappointed.

    A few terms contributing to the zero-point energy in quantum electrodynamics. Image credit: R. L. Jaffe, from https://arxiv.org/pdf/hep-th/0503158.pdf.

    We wind up with contributions that are approximately 120 orders of magnitude too big, both positive and negative. As far as we can tell, they don’t cancel exactly, and even if they did, we still have that pesky observational problem that the Universe isn’t recollapsing, slowing down or asymptoting to a zero rate; it’s really, truly accelerating. Somehow there’s a small but non-zero energy inherent to space itself.
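The famous size of that mismatch can be reproduced with a back-of-the-envelope estimate. A sketch using standard physical constants; the observed dark-energy density is an approximate assumed value:

```python
import math

# Compare the naive Planck-scale vacuum energy density from quantum field
# theory with the observed dark-energy density of the universe.

HBAR = 1.055e-34  # reduced Planck constant, J*s
C = 3.0e8         # speed of light, m/s
G = 6.674e-11     # Newton's constant, m^3 kg^-1 s^-2

l_planck = math.sqrt(HBAR * G / C**3)  # Planck length, ~1.6e-35 m
e_planck = math.sqrt(HBAR * C**5 / G)  # Planck energy, ~2e9 J

rho_qft = e_planck / l_planck**3  # naive vacuum energy density, J/m^3
rho_obs = 6e-10                   # observed dark-energy density, J/m^3 (approximate)

orders = math.log10(rho_qft / rho_obs)
print(f"mismatch: roughly {orders:.0f} orders of magnitude")
```

This is the canonical “roughly 120 orders of magnitude”; the exact figure depends on where the calculation is cut off.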

    The four possible fates of our Universe into the future; the last one appears to be the Universe we live in, dominated by dark energy. Image credit: E. Siegel.

Perhaps the biggest theoretical question of all is why. We literally have no good explanation for the cause of this dark energy. We recently looked at the possibility that it’s frozen neutrinos, or that it could be a symptom of something wrong with our picture of the expanding Universe. But there’s another possibility that gets very little attention and ought to get a lot more: it could be a property of empty space itself that is caused by the presence of other things — effective boundaries — in the Universe.

    And the reason this is possible is because this is an effect that we know exists: the Casimir effect.

    An illustration of the Casimir effect, and how the forces on the outside of the plates are different from the forces on the inside. Image credit: Wikimedia commons user Emok, under a c.c.a.-by-s.a.-3.0 license.

What’s the electromagnetic force of empty space? It’s nothing, of course. You wouldn’t even be wrong in saying that it’s nothing. But put two metal plates a finite distance apart, and then ask what the electromagnetic force is, and you find it isn’t zero! Because some of the vacuum fluctuation modes are forbidden by the boundaries of the plates, we not only predict but measure a non-zero force between them, arising from nothing other than empty space itself. As it turns out, all of the forces, including the gravitational force, exhibit a Casimir effect as well.
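The textbook result for ideal parallel conducting plates is an attractive pressure P = π²ħc / (240 d⁴). A sketch evaluating it at two separations (real experiments need corrections for finite conductivity and temperature):

```python
import math

# Casimir pressure between two ideal, parallel, perfectly conducting plates.

HBAR = 1.055e-34  # reduced Planck constant, J*s
C = 3.0e8         # speed of light, m/s

def casimir_pressure(d):
    """Attractive pressure in pascals for plates separated by d metres."""
    return math.pi**2 * HBAR * C / (240 * d**4)

for d in (1e-6, 1e-8):  # one micron and ten nanometres
    print(f"d = {d:.0e} m -> P = {casimir_pressure(d):.2e} Pa")
```

Because of the d⁻⁴ scaling, the force is negligible at micron separations but approaches atmospheric pressure near 10 nanometres.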

A map of more than one million galaxies in the Universe, where each dot is its own galaxy. Image credit: Daniel Eisenstein and the SDSS-III collaboration.

So what happens if we apply this effect to the entire Universe, and try to calculate what the effect ought to be? The answer is simple: we get something that has a form that’s consistent with dark energy, although — once again — the magnitude is all wrong. This is quite possibly, though, a function of the fact that we don’t know what the boundary conditions of the Universe look like, or how to calculate this quantum gravitational effect very well. But it’s an incredible, well-researched possibility that has seen lots of interesting developments over the past decade.

    The 3D reconstruction of 120,000 galaxies and their clustering properties, inferred from their redshift and large-scale structure formation. Image credit: Jeremy Tinker and the SDSS-III collaboration.

Mapping the Universe might turn out to be the easy part. Perhaps it’s not going to be an observational or experimental breakthrough that will lead us to understanding dark energy, the most elusive force in the Universe. Perhaps it’s a theoretical one that’s needed. And perhaps it’s related to the trace anomaly, perhaps it’s a dynamical quantity that’s changed over time, and perhaps it’s even a sign of extra dimensions. The Universe is out there, and we’ve only recently uncovered this most difficult-to-explain secret. Perhaps the solution, if we’re careful, might lie in the physics we already know.

See the full article here.


“Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan.

  • richardmitnick 5:21 am on July 21, 2016 Permalink | Reply
Tags: Dark Energy

    From Science: “Attempt to explain away ‘dark energy’ takes a hit” 



    Jul. 19, 2016
    Adrian Cho

    A galaxy cluster as observed by the Canada-France-Hawaii telescope. Gravity from the cluster distorts the images of other galaxies in the background. L. Van Waerbeke and C. Heymans/CFHTLenS collaboration

    For nearly 20 years, physicists have known that the expansion of the universe has begun to speed up. This bizarre acceleration could arise because some form of mysterious dark energy is stretching space. Or, it could signal that physicists’ understanding of gravity isn’t quite right. But a new study puts the screws on a broad class of alternative theories of gravity, making it that much harder to explain away dark energy.

    The study is also path setting because it exploits an effect called weak lensing in which the gravity from closer galaxies distorts the images of more distant ones. “That’s the future,” says Bob Nichol, an observational cosmologist at the University of Portsmouth in the United Kingdom who was not involved in the study. “If you look to the next decade, there’s going to be an explosion of this data.”

Physicists had expected the universe’s expansion to be slowing as the galaxies pull on one another with their gravity. But in 1998, two independent teams traced the history of the universe’s expansion by studying type Ia supernovae: stellar explosions whose colors tell when they went off and whose brightness reveals how far away they are now. Both teams found that the expansion is speeding up, suggesting that dark energy is blowing up the universe like a balloon.
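The distance half of that measurement rests on standard-candle logic: a type Ia supernova’s intrinsic brightness is roughly known, so its observed faintness gives its distance. A sketch using the distance-modulus relation; the absolute magnitude is an assumed typical value, not a figure from the article:

```python
# Distance from observed brightness via the distance modulus
# m - M = 5 * log10(d / 10 pc).

M_IA = -19.3  # assumed typical absolute magnitude of a type Ia supernova

def distance_mpc(apparent_mag):
    """Luminosity distance in megaparsecs for a type Ia of apparent magnitude m."""
    d_parsec = 10 ** ((apparent_mag - M_IA + 5) / 5)
    return d_parsec / 1e6

print(f"a type Ia observed at m = 24 lies about {distance_mpc(24.0):.0f} Mpc away")
```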

    However, it’s possible that dark energy doesn’t exist and that the acceleration comes about instead because physicists’ understanding of gravity—Albert Einstein’s general theory of relativity—isn’t quite right. Einstein deduced that gravity arises because mass and energy warp spacetime. In general relativity, given the distribution of mass and energy, spacetime bends to minimize its curvature, denoted R. But in so-called f(R) (pronounced “eff-of-are”) theories, spacetime contorts to minimize the curvature plus some extra function of the curvature. That change produces an extra gravitylike force that can either attract or repel under different conditions.

    In 2007, theorists Wayne Hu of the University of Chicago in Illinois and Ignacy Sawicki, now at the University of Geneva in Switzerland, showed that, with the right choice of the function f(R), such a theory might explain the accelerating expansion without dark energy. To do that, the extra force has to disappear where gravity is relatively strong, such as within a galaxy or the early universe, and kick in on the largest scales and at later times.
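The Hu-Sawicki model proposes a specific functional form, f(R) = −m²c₁(R/m²)ⁿ / (c₂(R/m²)ⁿ + 1). A sketch with illustrative parameters (n = 1, c₁ = c₂ = 1; real fits tune these to cosmological data) showing the screening behavior described above: at high curvature f tends to a constant, mimicking a cosmological constant, while its derivative f_R, which controls the extra force, shrinks away:

```python
# Hu-Sawicki f(R), evaluated with illustrative parameters n = 1, c1 = c2 = 1.
# f_R = df/dR controls the extra gravity-like force; it must vanish where
# curvature (gravity) is strong, which is what happens as x grows.

def hu_sawicki(x, n=1.0, c1=1.0, c2=1.0):
    """Return (f/m^2, f_R) for dimensionless curvature x = R/m^2."""
    f_over_m2 = -c1 * x**n / (c2 * x**n + 1)
    f_r = -c1 * n * x ** (n - 1) / (c2 * x**n + 1) ** 2
    return f_over_m2, f_r

for x in (1.0, 100.0, 10000.0):
    f, fr = hu_sawicki(x)
    print(f"R/m^2 = {x:>7.0f}: f/m^2 = {f:+.4f}, f_R = {fr:+.2e}")
```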

    This map shows the distribution of matter—dark and ordinary—deduced through weak lensing. Cosmologists used such a map to test an alternate theory of gravity. Van Waerbeke, Heymans/CFHTLenS

    To test such theories, scientists must study the universe on huge scales. Last year, Nichol and colleagues tested f(R) theory by tallying galaxy clusters spanning millions of light-years. If dark energy is stretching space, then it should slow the formation of massive clusters and produce fewer of them than f(R) gravity would. Nichol and colleagues found numbers consistent with dark energy. The analysis is tricky, however. Researchers need to estimate the mass of each cluster, which comes mostly from mysterious, invisible dark matter. So Nichol and colleagues inferred a cluster’s mass from x-rays coming from hot gas within it, relying on theoretical modeling of the interplay of ordinary and dark matter.

    Now, a team of scientists led by Zuhui Fan, an astronomer at Peking University in Beijing, has taken an approach that measures a cluster’s mass directly. Gravity from a massive object can distort the images of things beyond it. A galaxy cluster thus distorts the images of more distant galaxies, so that instead of being oriented randomly in the sky, their elongated shapes align slightly, like fish in a school. The strength of that “weak lensing” directly reveals the mass of the foreground cluster. “You don’t rely on the scaling between the cluster’s mass and its [ordinary matter] content,” says Baojiu Li, a cosmologist at Durham University in the United Kingdom, who worked on the study.
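That “fish in a school” averaging can be seen in a toy simulation (illustrative numbers only): each background galaxy’s observed shape is a large random intrinsic ellipticity plus a small coherent shear from the foreground cluster, and averaging over many galaxies recovers the shear:

```python
import random

# Toy weak-lensing measurement: recover a small coherent shear buried in
# much larger random intrinsic galaxy shapes by averaging.

random.seed(42)  # fixed seed so the toy result is reproducible

TRUE_SHEAR = 0.02   # coherent distortion imposed by the cluster (assumed)
N_GALAXIES = 100_000

# observed ellipticity = random intrinsic shape + coherent shear
observed = [random.gauss(0.0, 0.3) + TRUE_SHEAR for _ in range(N_GALAXIES)]

estimated = sum(observed) / N_GALAXIES
print(f"recovered shear: {estimated:.4f} (true value: {TRUE_SHEAR})")
```

The statistical error scales as the intrinsic shape scatter divided by the square root of the number of galaxies, which is why surveys need millions of them.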

The researchers used data from the 3.6-meter Canada-France-Hawaii Telescope on Mauna Kea in Hawaii, which imaged 5.5 million galaxies to create a weak lensing map covering 154 square degrees of sky. From the “peaks” in the map, they tallied clusters weighing hundreds of times as much as our Milky Way galaxy, they report in a paper in press at Physical Review Letters. Those tallies agree with the predictions of dark energy and weaken the case for f(R) theories.

    CFHT Telescope, Mauna Kea, Hawaii, USA
    CFHT Interior

“At the moment, this is the best measurement on the cosmological scale,” Nichol says. The new result doesn’t quite kill f(R) theory, but if the limit on a key parameter can be lowered by another factor of 10, Nichol says, “I suspect that people will say, ‘This theory is not it.’”

    However, Hu questions how far the method can be pushed. Testing f(R) gravity further may require accounting for the detailed distribution of dark matter within individual clusters, he says. But that distribution will be modified by the interplay between dark and ordinary matter, Hu says, bringing the issue back into play.

    Still, experts say, the new work shows the potential to probe the cosmos with weak lensing. The Large Synoptic Survey Telescope, under construction in Cerro Pachón, Chile, will map weak lensing over 20,000 square degrees—roughly half the sky. The European Space Agency’s proposed Euclid spacecraft and NASA’s proposed Wide Field Infrared Survey Telescope satellite [WFIRST] will employ the technique. “In terms of data quality,” Li says, “there’s going to be a big improvement from what we have now.”

    LSST telescope, currently under construction at Cerro Pachón Chile

    ESA/Euclid spacecraft


See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.

