Tagged: Dark Matter

  • richardmitnick 1:35 pm on December 2, 2016 Permalink | Reply
    Tags: Dark Interactions Workshop, Dark Matter

    From BNL: “Dark Interactions Workshop Hosts Physicists from Around the World” 

    Brookhaven Lab

    November 23, 2016
    Chelsea Whyte

    Dozens of experimental and theoretical physicists convened at the U.S. Department of Energy’s Brookhaven National Laboratory in October for the second biennial Dark Interactions Workshop. Attendees came from universities and laboratories worldwide to discuss current research and possible future searches for dark sector states such as dark matter.


    Two great cosmic mysteries – dark energy and dark matter – make up nearly 95% of the universe’s energy budget. Dark energy is the proposed agent behind the ever-increasing expansion of the universe. Some force must propel the accelerating rate at which the fabric of space is stretching, but its origin and makeup are still unknown. Dark matter, first proposed over 80 years ago, is theorized to be the mass responsible for most of the immense gravitational pull that galaxy clusters exert. Without its presence, galaxies and galaxy clusters shouldn’t hang together as they do, according to the laws of gravity that permeate our cosmos.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists know this much. It’s a bit like a map of a continent with the outlines drawn, but large holes that need a lot of filling in. “There are a lot of things we know that we don’t know,” said Brookhaven physicist Ketevi Assamagan, who organized the workshop along with Brookhaven physicists Hooman Davoudiasl and Mary Bishai, and Stony Brook University physicist Rouven Essig.

    The Dark Interactions Workshop was created to gather great minds in search of answers to these cosmic questions, and to share knowledge across the many different types of experiments searching for dark-sector particles. “The goals are to search for several well-motivated dark-sector particles with existing and upcoming experiments, but also to propose new experiments that can lead the search for dark forces in the coming decade. This requires in-depth discussions among theorists and experimentalists,” Essig said.

    The sessions ranged from discussing theories to status updates from dark-particle searches following the first workshop two years ago. Attendees included post-docs as well as tenured scientists, and Assamagan said workshops like this are crucial for allowing a diverse and somewhat disparate group of scientists in a dense field of study to get to know each other’s work and build collaborations.

    “Dark matter is one of the hot topics in particle and astrophysics today. We know that we don’t have the complete story when it comes to our universe. Understanding the nature of dark matter would be a revolution,” Assamagan said.

    While tantalizing theories have directed physicists to build new ways to search for dark sector states, conclusive evidence still eludes scientists. “Since there is currently a vast range of possibilities for what could constitute the dark sector, a variety of innovative approaches for answering this question need to be considered,” Davoudiasl said. “To that end, meetings like this are quite helpful as they facilitate the exchange of new ideas.”

    “There’s still a lot of hope. Meetings like this one show that there are a lot of clever people working in this field and a lot of collaboration between them. Hopefully at our next workshop, we’ll be sharing evidence that we’ve discovered something of the dark sector,” said Assamagan.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    STEM Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 7:05 pm on November 30, 2016 Permalink | Reply
    Tags: Dark Matter

    From Quanta: “The Case Against Dark Matter” 

    Quanta Magazine

    November 29, 2016
    Natalie Wolchover

    Erik Verlinde
    Ilvy Njiokiktjien for Quanta Magazine

    For 80 years, scientists have puzzled over the way galaxies and other cosmic structures appear to gravitate toward something they cannot see. This hypothetical “dark matter” seems to outweigh all visible matter by a startling ratio of five to one, suggesting that we barely know our own universe. Thousands of physicists are doggedly searching for these invisible particles.

    But the dark matter hypothesis assumes scientists know how matter in the sky ought to move in the first place. This month, a series of developments has revived a long-disfavored argument that dark matter doesn’t exist after all. In this view, no missing matter is needed to explain the errant motions of the heavenly bodies; rather, on cosmic scales, gravity itself works in a different way than either Isaac Newton or Albert Einstein predicted.

    The latest attempt to explain away dark matter is a much-discussed proposal by Erik Verlinde, a theoretical physicist at the University of Amsterdam who is known for bold and prescient, if sometimes imperfect, ideas. In a dense 51-page paper posted online on Nov. 7, Verlinde casts gravity as a byproduct of quantum interactions and suggests that the extra gravity attributed to dark matter is an effect of “dark energy” — the background energy woven into the space-time fabric of the universe.

    Instead of hordes of invisible particles, “dark matter is an interplay between ordinary matter and dark energy,” Verlinde said.

    To make his case, Verlinde has adopted a radical perspective on the origin of gravity that is currently in vogue among leading theoretical physicists. Einstein defined gravity as the effect of curves in space-time created by the presence of matter. According to the new approach, gravity is an emergent phenomenon. Space-time and the matter within it are treated as a hologram that arises from an underlying network of quantum bits (called “qubits”), much as the three-dimensional environment of a computer game is encoded in classical bits on a silicon chip. Working within this framework, Verlinde traces dark energy to a property of these underlying qubits that supposedly encode the universe. On large scales in the hologram, he argues, dark energy interacts with matter in just the right way to create the illusion of dark matter.

    In his calculations, Verlinde rediscovered the equations of “modified Newtonian dynamics,” or MOND. This 30-year-old theory makes an ad hoc tweak to the famous “inverse-square” law of gravity in Newton’s and Einstein’s theories in order to explain some of the phenomena attributed to dark matter. That this ugly fix works at all has long puzzled physicists. “I have a way of understanding the MOND success from a more fundamental perspective,” Verlinde said.

    Many experts have called Verlinde’s paper compelling but hard to follow. While it remains to be seen whether his arguments will hold up to scrutiny, the timing is fortuitous. In a new analysis of galaxies published on Nov. 9 in Physical Review Letters, three astrophysicists led by Stacy McGaugh of Case Western Reserve University in Cleveland, Ohio, have strengthened MOND’s case against dark matter.

    The researchers analyzed a diverse set of 153 galaxies, and for each one they compared the rotation speed of visible matter at any given distance from the galaxy’s center with the amount of visible matter contained within that galactic radius. Remarkably, these two variables were tightly linked in all the galaxies by a universal law, dubbed the “radial acceleration relation.” This makes perfect sense in the MOND paradigm, since visible matter is the exclusive source of the gravity driving the galaxy’s rotation (even if that gravity does not take the form prescribed by Newton or Einstein). With such a tight relationship between gravity felt by visible matter and gravity given by visible matter, there would seem to be no room, or need, for dark matter.
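The limiting behavior of the radial acceleration relation can be sketched numerically. The interpolating form below is the fit reported by McGaugh and colleagues, with the acceleration scale g† ≈ 1.2×10⁻¹⁰ m/s² an empirical parameter; this is an illustrative sketch, not the published analysis code:

```python
import math

G_DAGGER = 1.2e-10  # empirical acceleration scale (m/s^2) from the fit


def g_observed(g_baryonic):
    """Observed centripetal acceleration as a function of the acceleration
    expected from visible (baryonic) matter alone, per the fitted
    radial acceleration relation."""
    return g_baryonic / (1.0 - math.exp(-math.sqrt(g_baryonic / G_DAGGER)))


# High accelerations (inner galaxy): observed ~ baryonic, i.e. Newtonian.
# Low accelerations (outer galaxy): observed ~ sqrt(g_baryonic * G_DAGGER),
# larger than the Newtonian expectation -- the "missing gravity" usually
# attributed to dark matter.
```

At every radius the observed acceleration is fixed by the visible matter alone, which is why the relation leaves no apparent room for an independent dark matter component.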

    Even as dark matter proponents rise to its defense, a third challenge has materialized. In new research that has been presented at seminars and is under review by the Monthly Notices of the Royal Astronomical Society, a team of Dutch astronomers have conducted what they call the first test of Verlinde’s theory: In comparing his formulas to data from more than 30,000 galaxies, Margot Brouwer of Leiden University in the Netherlands and her colleagues found that Verlinde correctly predicts the gravitational distortion or “lensing” of light from the galaxies — another phenomenon that is normally attributed to dark matter. This is somewhat to be expected, as MOND’s original developer, the Israeli astrophysicist Mordehai Milgrom, showed years ago that MOND accounts for gravitational lensing data. Verlinde’s theory will need to succeed at reproducing dark matter phenomena in cases where the old MOND failed.

    Kathryn Zurek, a dark matter theorist at Lawrence Berkeley National Laboratory, said Verlinde’s proposal at least demonstrates how something like MOND might be right after all. “One of the challenges with modified gravity is that there was no sensible theory that gives rise to this behavior,” she said. “If [Verlinde’s] paper ends up giving that framework, then that by itself could be enough to breathe more life into looking at [MOND] more seriously.”

    The New MOND

    In Newton’s and Einstein’s theories, the gravitational attraction of a massive object drops in proportion to the square of the distance away from it. This means stars orbiting around a galaxy should feel less gravitational pull — and orbit more slowly — the farther they are from the galactic center. Stars’ velocities do drop as predicted by the inverse-square law in the inner galaxy, but instead of continuing to drop as they get farther away, their velocities level off beyond a certain point. The “flattening” of galaxy rotation speeds, discovered by the astronomer Vera Rubin in the 1970s, is widely considered to be Exhibit A in the case for dark matter — explained, in that paradigm, by dark matter clouds or “halos” that surround galaxies and give an extra gravitational acceleration to their outlying stars.

    Searches for dark matter particles have proliferated — with hypothetical “weakly interacting massive particles” (WIMPs) and lighter-weight “axions” serving as prime candidates — but so far, experiments have found nothing.

    Lucy Reading-Ikkanda for Quanta Magazine

    Meanwhile, in the 1970s and 1980s, some researchers, including Milgrom, took a different tack. Many early attempts at tweaking gravity were easy to rule out, but Milgrom found a winning formula: When the gravitational acceleration felt by a star drops below a certain level — precisely 0.00000000012 meters per second per second, or 100 billion times weaker than we feel on the surface of the Earth — he postulated that gravity somehow switches from an inverse-square law to something close to an inverse-distance law. “There’s this magic scale,” McGaugh said. “Above this scale, everything is normal and Newtonian. Below this scale is where things get strange. But the theory does not really specify how you get from one regime to the other.”
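Milgrom's regime switch can be illustrated with a toy calculation. The hard switch below is a deliberate caricature (real MOND uses a smooth interpolating function, which, as McGaugh notes, the theory does not specify), and the mass and radii in the test values are made-up illustrative numbers:

```python
import math

G = 6.674e-11   # Newton's constant, m^3 kg^-1 s^-2
A0 = 1.2e-10    # Milgrom's acceleration scale, m/s^2


def circular_velocity(mass_kg, radius_m):
    """Circular orbital speed with a crude MOND-style switch:
    inverse-square gravity above A0, inverse-distance below it."""
    g_newton = G * mass_kg / radius_m ** 2
    if g_newton >= A0:
        g = g_newton                    # normal Newtonian regime
    else:
        g = math.sqrt(g_newton * A0)    # deep-MOND regime: g falls as 1/r
    return math.sqrt(g * radius_m)
```

In the deep-MOND regime this gives v⁴ = G·M·A0, independent of radius: exactly the flattening of galaxy rotation curves that Rubin observed.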

    Physicists do not like magic; when other cosmological observations seemed far easier to explain with dark matter than with MOND, they left the approach for dead. Verlinde’s theory revitalizes MOND by attempting to reveal the method behind the magic.

    Verlinde, ruddy and fluffy-haired at 54 and lauded for highly technical string theory calculations, first jotted down a back-of-the-envelope version of his idea in 2010. It built on a famous paper he had written months earlier, in which he boldly declared that gravity does not really exist. By weaving together numerous concepts and conjectures at the vanguard of physics, he had concluded that gravity is an emergent thermodynamic effect, related to increasing entropy (or disorder). Then, as now, experts were uncertain what to make of the paper, though it inspired fruitful discussions.

    The particular brand of emergent gravity in Verlinde’s paper turned out not to be quite right, but he was tapping into the same intuition that led other theorists to develop the modern holographic description of emergent gravity and space-time — an approach that Verlinde has now absorbed into his new work.

    In this framework, bendy, curvy space-time and everything in it is a geometric representation of pure quantum information — that is, data stored in qubits. Unlike classical bits, qubits can exist simultaneously in two states (0 and 1) with varying degrees of probability, and they become “entangled” with each other, such that the state of one qubit determines the state of the other, and vice versa, no matter how far apart they are. Physicists have begun to work out the rules by which the entanglement structure of qubits mathematically translates into an associated space-time geometry. An array of qubits entangled with their nearest neighbors might encode flat space, for instance, while more complicated patterns of entanglement give rise to matter particles such as quarks and electrons, whose mass causes the space-time to be curved, producing gravity. “The best way we understand quantum gravity currently is this holographic approach,” said Mark Van Raamsdonk, a physicist at the University of British Columbia in Vancouver who has done influential work on the subject.

    The mathematical translations are rapidly being worked out for holographic universes with an Escher-esque space-time geometry known as anti-de Sitter (AdS) space, but universes like ours, which have de Sitter geometries, have proved far more difficult. In his new paper, Verlinde speculates that it’s exactly the de Sitter property of our native space-time that leads to the dark matter illusion.

    De Sitter space-times like ours stretch as you look far into the distance. For this to happen, space-time must be infused with a tiny amount of background energy — often called dark energy — which drives space-time apart from itself. Verlinde models dark energy as a thermal energy, as if our universe has been heated to an excited state. (AdS space, by contrast, is like a system in its ground state.) Verlinde associates this thermal energy with long-range entanglement between the underlying qubits, as if they have been shaken up, driving entangled pairs far apart. He argues that this long-range entanglement is disrupted by the presence of matter, which essentially removes dark energy from the region of space-time that it occupied. The dark energy then tries to move back into this space, exerting a kind of elastic response on the matter that is equivalent to a gravitational attraction.

    Because of the long-range nature of the entanglement, the elastic response becomes increasingly important in larger volumes of space-time. Verlinde calculates that it will cause galaxy rotation curves to start deviating from Newton’s inverse-square law at exactly the magic acceleration scale pinpointed by Milgrom in his original MOND theory.

    Van Raamsdonk calls Verlinde’s idea “definitely an important direction.” But he says it’s too soon to tell whether everything in the paper — which draws from quantum information theory, thermodynamics, condensed matter physics, holography and astrophysics — hangs together. Either way, Van Raamsdonk said, “I do find the premise interesting, and feel like the effort to understand whether something like that could be right could be enlightening.”

    One problem, said Brian Swingle of Harvard and Brandeis universities, who also works in holography, is that Verlinde lacks a concrete model universe like the ones researchers can construct in AdS space, giving him more wiggle room for making unproven speculations. “To be fair, we’ve gotten further by working in a more limited context, one which is less relevant for our own gravitational universe,” Swingle said, referring to work in AdS space. “We do need to address universes more like our own, so I hold out some hope that his new paper will provide some additional clues or ideas going forward.”

    Access the mp4 video here.

    The Case for Dark Matter

    Verlinde could be capturing the zeitgeist the way his 2010 entropic-gravity paper did. Or he could be flat-out wrong. The question is whether his new and improved MOND can reproduce phenomena that foiled the old MOND and bolstered belief in dark matter.

    One such phenomenon is the Bullet cluster, a galaxy cluster in the process of colliding with another.

    X-ray photo by Chandra X-ray Observatory of the Bullet Cluster (1E0657-56). Exposure time was 0.5 million seconds (~140 hours) and the scale is shown in megaparsecs. Redshift (z) = 0.3, meaning its light has wavelengths stretched by a factor of 1.3. Based on today’s theories this shows the cluster to be about 4 billion light years away.
    In this photograph, a rapidly moving galaxy cluster with a shock wave trailing behind it seems to have hit another cluster at high speed. The gases collide, and the gravitational fields of the stars and galaxies interact. When the clusters collided, based on black-body temperature readings, the temperature reached 160 million degrees and X-rays were emitted in great intensity, earning it the title of hottest known galaxy cluster.
    Studies of the Bullet cluster, announced in August 2006, provide the best evidence to date for the existence of dark matter.

    Superimposed mass density contours, caused by gravitational lensing of dark matter. Photograph taken with Hubble Space Telescope.
    Date 22 August 2006

    The visible matter in the two clusters crashes together, but gravitational lensing suggests that a large amount of dark matter, which does not interact with visible matter, has passed right through the crash site. Some physicists consider this indisputable proof of dark matter. However, Verlinde thinks his theory will be able to handle the Bullet cluster observations just fine. He says dark energy’s gravitational effect is embedded in space-time and is less deformable than matter itself, which would have allowed the two to separate during the cluster collision.

    But the crowning achievement for Verlinde’s theory would be to account for the suspected imprints of dark matter in the cosmic microwave background (CMB), ancient light that offers a snapshot of the infant universe.

    CMB per ESA/Planck

    The snapshot reveals the way matter at the time repeatedly contracted due to its gravitational attraction and then expanded due to self-collisions, producing a series of peaks and troughs in the CMB data. Because dark matter does not interact, it would only have contracted without ever expanding, and this would modulate the amplitudes of the CMB peaks in exactly the way that scientists observe. One of the biggest strikes against the old MOND was its failure to predict this modulation and match the peaks’ amplitudes. Verlinde expects that his version will work — once again, because matter and the gravitational effect of dark energy can separate from each other and exhibit different behaviors. “Having said this,” he said, “I have not calculated this all through.”

    While Verlinde confronts these and a handful of other challenges, proponents of the dark matter hypothesis have some explaining of their own to do when it comes to McGaugh and his colleagues’ recent findings about the universal relationship between galaxy rotation speeds and their visible matter content.

    In October, responding to a preprint of the paper by McGaugh and his colleagues, two teams of astrophysicists independently argued that the dark matter hypothesis can account for the observations. They say the amount of dark matter in a galaxy’s halo would have precisely determined the amount of visible matter the galaxy ended up with when it formed. In that case, galaxies’ rotation speeds, even though they’re set by dark matter and visible matter combined, will exactly correlate with either their dark matter content or their visible matter content (since the two are not independent). However, computer simulations of galaxy formation do not currently indicate that galaxies’ dark and visible matter contents will always track each other. Experts are busy tweaking the simulations, but Arthur Kosowsky of the University of Pittsburgh, one of the researchers working on them, says it’s too early to tell if the simulations will be able to match all 153 examples of the universal law in McGaugh and his colleagues’ galaxy data set. If not, then the standard dark matter paradigm is in big trouble. “Obviously this is something that the community needs to look at more carefully,” Zurek said.

    Even if the simulations can be made to match the data, McGaugh, for one, considers it an implausible coincidence that dark matter and visible matter would conspire to exactly mimic the predictions of MOND at every location in every galaxy. “If somebody were to come to you and say, ‘The solar system doesn’t work on an inverse-square law, really it’s an inverse-cube law, but there’s dark matter that’s arranged just so that it always looks inverse-square,’ you would say that person is insane,” he said. “But that’s basically what we’re asking to be the case with dark matter here.”

    Given the considerable indirect evidence and near consensus among physicists that dark matter exists, it still probably does, Zurek said. “That said, you should always check that you’re not on a bandwagon,” she added. “Even though this paradigm explains everything, you should always check that there isn’t something else going on.”

    See the full article here.


    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

  • richardmitnick 1:28 pm on November 28, 2016 Permalink | Reply
    Tags: Bolometers, Dark Matter, EDELWEISS experiment, IPNL

    From IPNL: “First Results of the EDELWEISS III experiment” 


    Institut de Physique Nucleaire de Lyon

    13 May, 2016 [I just found these guys]
    Cazes Antoine

    EDELWEISS bolometers before installation. No image credit

    The EDELWEISS experiment aims to detect WIMPs, a candidate for dark matter particles. It is located in the Modane Underground Laboratory. The experiment operates bolometers cooled to a few tens of millikelvin, in which a WIMP might collide with a germanium nucleus and produce a nuclear recoil. The recoil is measured both through the resulting temperature rise (a few microkelvin) and through the ionisation it produces (germanium is a semiconductor). This double measurement identifies nuclear recoils and thus eliminates much of the background due to gamma rays from natural radioactivity.
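The heat-plus-ionisation discrimination can be caricatured in a few lines. The yield values and cut below are illustrative, not EDELWEISS calibration numbers, and the real analysis uses calibrated bands and a boosted decision tree rather than a single ratio cut:

```python
def looks_like_nuclear_recoil(heat_keV, ionisation_keVee, max_yield=0.5):
    """Toy event discrimination for a germanium bolometer.

    A gamma ray scattering off electrons ionises efficiently
    (yield near 1), while a WIMP-induced nuclear recoil ionises
    poorly (yield roughly 0.3), so a low ionisation-to-heat ratio
    flags a WIMP candidate. Values here are illustrative only.
    """
    if heat_keV <= 0:
        raise ValueError("heat signal must be positive")
    return (ionisation_keVee / heat_keV) < max_yield
```

Under these toy numbers, a 10 keV event depositing 3 keVee of ionisation would pass the cut, while one depositing 9 keVee (typical of a gamma) would not.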

    For the third phase of the EDELWEISS experiment, the bolometers were greatly improved and the cryostat was redesigned to reduce background noise and accommodate a larger detector mass. The experiment ran from July 2014 to April 2015. The data, equivalent to 582 kg·days of exposure, were blindly analyzed, with background rejection performed using a boosted decision tree. No WIMP signal was detected, improving on the previous EDELWEISS-II limit by a factor of 12 to 41: for a 5 GeV/c² WIMP, WIMP–nucleon collision cross sections above 4.3×10⁻⁴⁰ cm² are excluded, as are those above 9.4×10⁻⁴⁴ cm² for a 20 GeV/c² WIMP.

    The EDELWEISS experiment is now engaged in a major R&D effort aimed at lowering the bolometers’ detection thresholds to explore collisions with low-mass WIMPs (below 5 GeV/c²). This work is carried out in particular with the IOL cryostat installed at IPNL.

    Science paper:
    Constraints on low-mass WIMPs from the EDELWEISS-III dark matter search

    See the full article here.


  • richardmitnick 2:40 pm on November 25, 2016 Permalink | Reply
    Tags: Dark Matter, GridPP, Shear brilliance: computing tackles the mystery of the dark universe

    From U Manchester: “Shear brilliance: computing tackles the mystery of the dark universe” 

    U Manchester bloc

    University of Manchester

    24 November 2016
    No writer credit found

    Scientists from The University of Manchester working on a revolutionary telescope project have harnessed the power of distributed computing from the UK’s GridPP collaboration to tackle one of the Universe’s biggest mysteries – the nature of dark matter and dark energy.

    Researchers at The University of Manchester have used resources provided by GridPP – which represents the UK’s contribution to the computing grid used to find the Higgs boson at CERN – to run image processing and machine learning algorithms on thousands of images of galaxies from the international Dark Energy Survey.

    Dark Energy Icon

    The Manchester team are part of the collaborative project to build the Large Synoptic Survey Telescope (LSST), a new kind of telescope currently under construction in Chile and designed to conduct a 10-year survey of the dynamic Universe. LSST will be able to map the entire visible sky.

    LSST/Camera, built at SLAC

    LSST Interior
    LSST telescope, currently under construction at Cerro Pachón Chile

    In preparation for the start of the LSST’s revolutionary scanning, a pilot research project has helped researchers detect and map the cosmic shear seen across the night sky, one of the tell-tale signs of the dark matter and dark energy thought to make up some 95 per cent of the Universe. This in turn will help prepare for the analysis of the expected 200 petabytes of data the LSST will collect when it starts operating in 2023.

    The pilot research team, based at The University of Manchester, was led by Dr Joe Zuntz, a cosmologist originally at Manchester’s Jodrell Bank Observatory and now a researcher at the Royal Observatory in Edinburgh.

    “Our overall aim is to tackle the mystery of the dark universe – and this pilot project has been hugely significant. When the LSST is fully operating researchers will face a galactic data deluge – and our work will prepare us for the analytical challenge ahead.”
    Sarah Bridle, Professor of Astrophysics

    Dr George Beckett, the LSST-UK Science Centre Project Manager based at The University of Edinburgh, added: “The pilot has been a great success. Having completed the work, Joe and his colleagues are able to carry out shear analysis on vast image sets much faster than was previously the case. Thanks are due to the members of the GridPP community for their assistance and support throughout.”

    The LSST will produce images of galaxies in a wide variety of frequency bands of the visible electromagnetic spectrum, with each image giving different information about the galaxy’s nature and history. In times gone by, the measurements needed to determine properties like cosmic shear might have been done by hand, or at least with human-supervised computer processing.

    With the billions of galaxies expected to be observed by LSST, such approaches are unfeasible. Specialised image processing and machine learning software (Zuntz 2013) has therefore been developed for use with galaxy images from telescopes like LSST and its predecessors. This can be used to produce cosmic shear maps like those shown in the figure below. The challenge then becomes one of processing and managing the data for hundreds of thousands of galaxies and extracting scientific results required by LSST researchers and the wider astrophysics community.

    As each galaxy is essentially independent of other galaxies in the catalogue, the image processing workflow itself is highly parallelisable. This makes it an ideal problem to tackle with the kind of High-Throughput Computing (HTC) resources and infrastructure offered by GridPP. In many ways, the data from CERN’s Large Hadron Collider particle collision events is like that produced by a digital camera (indeed, pixel-based detectors are used near the interaction points) – and GridPP regularly processes billions of such events as part of the Worldwide LHC Computing Grid (WLCG).
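The "one independent task per galaxy" pattern is what makes the workflow grid-friendly. A toy single-machine analogue, with the standard library's multiprocessing pool standing in for Ganga/DIRAC job submission and a dummy calculation in place of the real IM3SHAPE shape measurement, might look like:

```python
from multiprocessing import Pool


def measure_galaxy(job):
    """Stand-in for the per-galaxy shape measurement; in the real
    pipeline this step is an IM3SHAPE fit run as a grid job."""
    galaxy_id, pixels = job
    mean_flux = sum(pixels) / len(pixels)   # dummy "measurement"
    return galaxy_id, mean_flux


def run_catalogue(catalogue):
    # Each galaxy is independent, so a plain parallel map suffices;
    # on the grid, each entry would instead become a submitted job.
    with Pool() as pool:
        return pool.map(measure_galaxy, catalogue)


if __name__ == "__main__":
    toy_catalogue = [(i, [1.0, 2.0, 3.0]) for i in range(100)]
    results = run_catalogue(toy_catalogue)
```

Because there is no shared state between tasks, scaling from a local pool to thousands of grid jobs changes the scheduling machinery but not the structure of the computation.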

    A pilot exercise, led by Dr Joe Zuntz while at The University of Manchester and supported by one of the longest serving and most experienced GridPP experts, Senior System Administrator Alessandra Forti, saw the porting of the image analysis workflow to GridPP’s distributed computing infrastructure. Data from the Dark Energy Survey (DES) was used for the pilot.

    After transferring this data from the US to GridPP Storage Elements, and enabling the LSST Virtual Organisation on a number of GridPP Tier-2 sites, the IM3SHAPE analysis software package (Zuntz, 2013) was tested on local, grid-friendly client machines to ensure smooth running on the grid. Analysis jobs were then submitted and managed using the Ganga software suite, which is able to coordinate the thousands of individual analyses associated with each batch of galaxies. Initial runs were submitted using Ganga to local grid sites, but the pilot progressed to submission to multiple sites via the GridPP DIRAC (Distributed Infrastructure with Remote Agent Control) service. The flexibility of Ganga allows both types of submission, which made the transition from local to distributed running significantly easier.

    By the end of pilot, Dr Zuntz was able to run the image processing workflow on multiple GridPP sites, regularly submitting thousands of analysis jobs on DES images.

    See the full article here.


    U Manchester campus

    The University of Manchester (UoM) is a public research university in the city of Manchester, England. It was formed in 2004 by the merger of the University of Manchester Institute of Science and Technology (established in 1956 as the Manchester College of Science and Technology and renamed in 1966), whose ultimate origins lay in the Mechanics’ Institute established in the city in 1824, and the Victoria University of Manchester, founded by charter in 1904 after the dissolution of the federal Victoria University (which also had members in Leeds and Liverpool) but originating in Owens College, founded in Manchester in 1851. The University of Manchester is regarded as a red brick university and was a product of the civic university movement of the late 19th century. It formed a constituent part of the federal Victoria University between 1880, when it received its royal charter, and 1903–1904, when the federal university was dissolved.

    The University of Manchester is ranked 33rd in the world by QS World University Rankings 2015-16. In the 2015 Academic Ranking of World Universities, Manchester is ranked 41st in the world and 5th in the UK. In an employability ranking published by Emerging in 2015, where CEOs and chairmen were asked to select the top universities which they recruited from, Manchester placed 24th in the world and 5th nationally. The Global Employability University Ranking conducted by THE places Manchester at 27th world-wide and 10th in Europe, ahead of academic powerhouses such as Cornell, UPenn and LSE. It is ranked joint 56th in the world and 18th in Europe in the 2015-16 Times Higher Education World University Rankings. In the 2014 Research Excellence Framework, Manchester came fifth in terms of research power and seventeenth for grade point average quality when including specialist institutions. More students try to gain entry to the University of Manchester than to any other university in the country, with more than 55,000 applications for undergraduate courses in 2014 resulting in 6.5 applicants for every place available. According to the 2015 High Fliers Report, Manchester is the most targeted university by the largest number of leading graduate employers in the UK.

    The university owns and operates major cultural assets such as the Manchester Museum, Whitworth Art Gallery, John Rylands Library and Jodrell Bank Observatory which includes the Grade I listed Lovell Telescope.

  • richardmitnick 1:14 pm on November 25, 2016 Permalink | Reply
    Tags: Dark Matter, NA64 experiment hunts the mysterious dark photon

    From CERN: “NA64 hunts the mysterious dark photon” 



    25 Nov 2016
    Stefania Pandolfi
    Posted by Corinne Pralavorio

    An overview of the NA64 experimental set-up at CERN. NA64 hunts down dark photons, hypothetical dark matter particles. (Image: Maximilien Brice/CERN)

    One of the biggest puzzles in physics is that eighty-five percent of the matter in our universe is “dark”: it does not interact with the photons of the conventional electromagnetic force and is therefore invisible to our eyes and telescopes. Although the composition and origin of dark matter are a mystery, we know it exists because astronomers observe its gravitational pull on ordinary visible matter such as stars and galaxies.

    Some theories suggest that, in addition to gravity, dark matter particles could interact with visible matter through a new force, which has so far escaped detection. Just as the electromagnetic force is carried by the photon, this dark force is thought to be transmitted by a particle called the “dark photon”, which is predicted to act as a mediator between visible and dark matter.

    “To use a metaphor, an otherwise impossible dialogue between two people not speaking the same language (visible and dark matter) can be enabled by a mediator (the dark photon), who understands one language and speaks the other one,” explains Sergei Gninenko, spokesperson for the NA64 collaboration.

    CERN’s NA64 experiment looks for signatures of this visible-dark interaction using a simple but powerful physics concept: the conservation of energy. A beam of electrons, whose initial energy is known very precisely, is aimed at a detector. Interactions between incoming electrons and atomic nuclei in the detector produce visible photons. The energy of these photons is measured and it should be equivalent to that of the electrons. However, if the dark photons exist, they will escape the detector and carry away a large fraction of the initial electron energy.

    Therefore, the signature of the dark photon is an event registered in the detector with a large amount of “missing energy” that cannot be attributed to a process involving only ordinary particles, thus providing a strong hint of the dark photon’s existence.
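    The energy bookkeeping behind this signature can be sketched in a few lines. The beam energy, threshold and event list below are illustrative assumptions, not NA64’s actual configuration:

```python
# Toy version of a missing-energy search: flag events where the detected
# energy falls well short of the known beam energy. All numbers here are
# illustrative assumptions, not the real NA64 set-up.

BEAM_ENERGY_GEV = 100.0        # incoming electron energy (assumed)
MISSING_THRESHOLD_GEV = 50.0   # flag events missing more than this (assumed)

def missing_energy(detected_gev):
    """Energy unaccounted for in the detector."""
    return BEAM_ENERGY_GEV - detected_gev

def is_candidate(detected_gev):
    """True if enough energy is missing to hint at an invisible particle."""
    return missing_energy(detected_gev) > MISSING_THRESHOLD_GEV

events = [99.8, 100.1, 42.0, 97.5, 30.2]   # detected energies, GeV (made up)
candidates = [e for e in events if is_candidate(e)]
print(candidates)   # the two events carrying large missing energy
```

In a real analysis the interesting part is ruling out mundane explanations for the missing energy (escaping neutrinos, detector gaps); the selection itself is just this subtraction.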

    If confirmed, the existence of the dark photon would represent a breakthrough in our understanding of the longstanding dark matter mystery.

    View of the NA64 experiment set-up. (Video: Christoph Madsen/Noemi Caraban/CERN)

    See the full article here.



  • richardmitnick 12:14 pm on November 8, 2016 Permalink | Reply
    Tags: Dark Matter

    From Symmetry: “The origins of dark matter” 


    Matthew R. Francis

    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    Theorists think dark matter was forged in the hot aftermath of the Big Bang.

    Transitions are everywhere we look. Water freezes, melts, or boils; chemical bonds break and form to make new substances out of different arrangements of atoms.

    The universe itself went through major transitions in early times. New particles were created and destroyed continually until things cooled enough to let them survive.

    CMB per ESA/Planck

    Those particles include ones we know about, such as the Higgs boson or the top quark.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    But they could also include dark matter, invisible particles which we presently know only because of their gravitational effects.

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al.

    In cosmic terms, dark matter particles could be a “thermal relic,” forged in the hot early universe and then left behind during the transitions to more moderate later eras. One of these transitions, known as “freeze-out,” changed the nature of the whole universe.

    The hot cosmic freezer

    On average, today’s universe is a pretty boring place. If you pick a random spot in the cosmos, it’s far more likely to be in intergalactic space than, say, the heart of a star or even inside an alien solar system. That spot is probably cold, dark and quiet.

    The same wasn’t true for a random spot shortly after the Big Bang.

    “The universe was so hot that particles were being produced from photons smashing into other photons, or photons hitting electrons, and electrons hitting positrons and producing these very heavy particles,” says Matthew Buckley of Rutgers University.

    The entire cosmos was a particle-smashing party, but parties aren’t meant to last. This one lasted only a trillionth of a second. After that came the cosmic freeze-out.

    During the freeze-out, the universe expanded and cooled enough for particles to collide far less frequently and catastrophically.

    “One of these massive particles floating through the universe is finding fewer and fewer antimatter versions of itself to collide with and annihilate,” Buckley says.

    “Eventually the universe would get large enough and cold enough that the rate of production and the rate of annihilation basically goes to zero, and you just get a relic abundance, these few particles that are floating out there lonely in space.”

    Many physicists think dark matter is a thermal relic, created in huge numbers before the cosmos was a half-second old and lingering today because it barely interacts with any other particle.
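    Freeze-out itself is usually described by a Boltzmann equation for the comoving abundance Y as a function of x = m/T. The toy integration below uses simplified units, an assumed interaction strength, and a schematic equilibrium abundance; it is only meant to show the characteristic behaviour: Y tracks equilibrium, then departs and “freezes” at a relic value while equilibrium plunges to zero.

```python
import math

# Schematic freeze-out: integrate dY/dx = -(lam / x^2) * (Y^2 - Yeq^2),
# where x = m/T grows as the universe cools and Y is the comoving number
# density. The strength lam (proportional to the annihilation cross-section)
# and the simplified non-relativistic equilibrium shape Yeq are assumptions
# chosen for illustration, not a real dark matter calculation.

lam = 1e3                                    # interaction strength (assumed)
Yeq = lambda x: x**1.5 * math.exp(-x)        # equilibrium abundance shape

x, Y, dx = 1.0, Yeq(1.0), 0.001
while x < 100.0:
    Y += -lam / x**2 * (Y**2 - Yeq(x)**2) * dx
    x += dx

print(Y)           # the surviving relic abundance (a few x 10^-3 here)
print(Yeq(100.0))  # equilibrium has dropped to essentially zero (~10^-41)
```

Increasing `lam` (a stronger interaction) keeps the particle in equilibrium longer and leaves a smaller relic abundance, which is the basic trade-off behind the next section.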

    A WIMPy miracle

    One reason to think of dark matter as a thermal relic is an interesting coincidence known as the “WIMP miracle.”

    WIMP stands for “weakly-interacting massive particle,” and WIMPs are the most widely accepted candidates for dark matter. Theory says WIMPs are likely heavier than protons and interact via the weak force, or at least through interactions related to the weak force.

    The last bit is important, because freeze-out for a specific particle depends on what forces affect it and the mass of the particle. Thermal relics made by the weak force were born early in the universe’s history because particles need to be jammed in tight for the weak force, which only works across short distances, to be a factor.

    “If dark matter is a thermal relic, you can calculate how big the interaction [between dark matter particles] needs to be,” Buckley says.

    Both the primordial light known as the cosmic microwave background and the behavior of galaxies tell us that most dark matter must be slow-moving (“cold” in the language of physics). That means interactions between dark matter particles must be low in strength.

    “Through what is perhaps a very deep fact about the universe,” Buckley says, “that interaction turns out to be the strength of what we know as the weak nuclear force.”

    That’s the WIMP miracle: The numbers are perfect to make just the right amount of WIMPy matter.
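    The “miracle” can be stated numerically with a standard rule of thumb from relic-abundance calculations (an approximation taken as background here, not from this article): Ωh² ≈ 3×10⁻²⁷ cm³ s⁻¹ / ⟨σv⟩.

```python
# Back-of-envelope WIMP miracle: plugging a typical weak-scale annihilation
# cross-section into the standard relic-abundance rule of thumb lands close
# to the observed dark matter density, Omega*h^2 ~ 0.12.

sigma_v_weak = 3e-26            # cm^3/s, typical weak-scale <sigma v>
omega_h2 = 3e-27 / sigma_v_weak
print(omega_h2)                 # ~0.1, close to the measured ~0.12
```

That one line of division is the coincidence the text calls a miracle: nothing about the weak force “knows” about cosmology, yet the numbers come out right.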

    The big catch, though, is that experiments haven’t found any WIMPs yet. It’s too soon to say WIMPs don’t exist, but it does rule out some of the simpler theoretical predictions about them.

    Ultimately, the WIMP miracle could just be a coincidence. Instead of the weak force, dark matter could involve a new force of nature that doesn’t affect ordinary matter strongly enough to detect. In that scenario, says Jessie Shelton of the University of Illinois at Urbana-Champaign, “you could have thermal freeze-out, but the freeze-out is of dark matter to some other dark field instead of [something in] the Standard Model.”

    In that scenario, dark matter would still be a thermal relic but not a WIMP.

    For Shelton, Buckley, and many other physicists, the dark matter search is still full of possibilities.

    “We have really compelling reasons to look for thermal WIMPs,” Shelton says. “It’s worth remembering that this is only one tiny corner of a much broader space of possibilities.”

    See the full article here.


    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 1:47 pm on November 2, 2016 Permalink | Reply
    Tags: Dark Matter, Quantum chromodynamics (QCD)

    From phys.org: “Supercomputer comes up with a profile of dark matter: Standard Model extension predicts properties of candidate particle” 


    November 2, 2016

    Simulated distribution of dark matter approximately three billion years after the Big Bang (illustration not from this work). Credit: The Virgo Consortium/Alexandre Amblard/ESA

    In the search for the mysterious dark matter, physicists have used elaborate computer calculations to come up with an outline of the particles of this unknown form of matter. To do this, the scientists extended the successful Standard Model of particle physics, which allowed them, among other things, to predict the mass of so-called axions, promising candidates for dark matter. The German-Hungarian team of researchers led by Professor Zoltán Fodor of the University of Wuppertal, Eötvös University in Budapest and Forschungszentrum Jülich carried out its calculations on Jülich’s supercomputer JUQUEEN.

    “Dark matter is an invisible form of matter which until now has only revealed itself through its gravitational effects. What it consists of remains a complete mystery,” explains co-author Dr Andreas Ringwald, who is based at DESY and who proposed the current research. Evidence for the existence of this form of matter comes, among other things, from the astrophysical observation of galaxies, which rotate far too rapidly to be held together only by the gravitational pull of the visible matter. High-precision measurements using the European satellite “Planck” show that almost 85 percent of the entire mass of the universe consists of dark matter. All the stars, planets, nebulae and other objects in space that are made of conventional matter account for no more than 15 percent of the mass of the universe.

    “The adjective ‘dark’ does not simply mean that it does not emit visible light,” says Ringwald. “It does not appear to give off any other wavelengths either – its interaction with photons must be very weak indeed.” For decades, physicists have been searching for particles of this new type of matter. What is clear is that these particles must lie beyond the Standard Model of particle physics, which, while extremely successful, currently describes only the conventional 15 percent of all matter in the cosmos. From theoretically possible extensions to the Standard Model, physicists expect not only a deeper understanding of the universe, but also concrete clues about the energy range in which the search for dark-matter candidates is particularly worthwhile.

    The unknown form of matter can either consist of comparatively few, but very heavy particles, or of a large number of light ones. The direct searches for heavy dark-matter candidates using large detectors in underground laboratories and the indirect search for them using large particle accelerators are still going on, but have not turned up any dark matter particles so far. A range of physical considerations make extremely light particles, dubbed axions, very promising candidates. Using clever experimental setups, it might even be possible to detect direct evidence of them. “However, to find this kind of evidence it would be extremely helpful to know what kind of mass we are looking for,” emphasises theoretical physicist Ringwald. “Otherwise the search could take decades, because one would have to scan far too large a range.”

    The existence of axions is predicted by an extension to quantum chromodynamics (QCD), the quantum theory that governs the strong interaction, responsible for the nuclear force. The strong interaction is one of the four fundamental forces of nature alongside gravitation, electromagnetism and the weak nuclear force, which is responsible for radioactivity. “Theoretical considerations indicate that there are so-called topological quantum fluctuations in quantum chromodynamics, which ought to result in an observable violation of time reversal symmetry,” explains Ringwald. This means that certain processes should differ depending on whether they are running forwards or backwards. However, no experiment has so far managed to demonstrate this effect.

    The extension to quantum chromodynamics (QCD) restores the invariance of time reversals, but at the same time it predicts the existence of a very weakly interacting particle, the axion, whose properties, in particular its mass, depend on the strength of the topological quantum fluctuations. However, it takes modern supercomputers like Jülich’s JUQUEEN to calculate the latter in the temperature range that is relevant in predicting the relative contribution of axions to the matter making up the universe. “On top of this, we had to develop new methods of analysis in order to achieve the required temperature range,” notes Fodor who led the research.
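    The quantity being computed here can be stated compactly. At zero temperature, the standard relation in the axion literature (background knowledge, not spelled out in the article) ties the axion mass to the topological susceptibility χ and the axion decay constant f_a:

```latex
% Axion mass from the topological susceptibility:
m_a^2 \, f_a^2 = \chi(T = 0)
```

At finite temperature the same relation holds with χ(T), and it is this temperature dependence that the lattice calculation on JUQUEEN supplies.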

    The results show, among other things, that if axions do make up the bulk of dark matter, they should have a mass of 50 to 1500 micro-electronvolts, expressed in the customary units of particle physics, and thus be up to ten billion times lighter than electrons. This would require every cubic centimetre of the universe to contain on average ten million such ultra-lightweight particles. Dark matter is not spread out evenly in the universe, however, but forms clumps and branches of a weblike network. Because of this, our local region of the Milky Way should contain about one trillion axions per cubic centimetre.
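    The quoted numbers can be cross-checked with back-of-envelope arithmetic; the dark matter densities used below are standard rough values assumed for illustration, not taken from this article:

```python
# Sanity-check the quoted axion figures: convert rough dark matter densities
# into particles per cm^3 for a 100 micro-eV axion (inside the predicted
# 50-1500 micro-eV window). Both density values are assumed round numbers.

EV_PER_GEV = 1e9
rho_cosmic_ev_cm3 = 1.3e3             # mean cosmic DM density, ~1300 eV/cm^3
rho_local_ev_cm3 = 0.3 * EV_PER_GEV   # local Milky Way DM density, ~0.3 GeV/cm^3

m_axion_ev = 100e-6                   # 100 micro-eV axion mass

n_cosmic = rho_cosmic_ev_cm3 / m_axion_ev
n_local = rho_local_ev_cm3 / m_axion_ev

print(f"{n_cosmic:.1e} axions/cm^3 on average")   # ~1e7: the quoted "ten million"
print(f"{n_local:.1e} axions/cm^3 locally")       # a few e12: order of the quoted trillion
print(511e3 / 50e-6)   # electron-to-axion mass ratio at 50 micro-eV: ~1e10
```

The last line checks the “up to ten billion times lighter than electrons” claim: a 511 keV electron over a 50 micro-eV axion is a ratio of about 10¹⁰.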

    Thanks to the Jülich supercomputer, the calculations now provide physicists with a concrete range in which their search for axions is likely to be most promising. “The results we are presenting will probably lead to a race to discover these particles,” says Fodor. Their discovery would not only solve the problem of dark matter in the universe, but at the same time answer the question why the strong interaction is so surprisingly symmetrical with respect to time reversal. The scientists expect that it will be possible within the next few years to either confirm or rule out the existence of axions experimentally.

    The Institute for Nuclear Research of the Hungarian Academy of Sciences in Debrecen, the Lendület Lattice Gauge Theory Research Group at the Eötvös University, the University of Zaragoza in Spain, and the Max Planck Institute for Physics in Munich were also involved in the research.

    S. Borsanyi et al, Calculation of the axion mass based on high-temperature lattice quantum chromodynamics, Nature (2016). DOI: 10.1038/nature20115

    See the full article here.


    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quancast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

  • richardmitnick 2:29 pm on October 28, 2016 Permalink | Reply
    Tags: Dark Matter, Why Doesn’t Dark Matter Form Black Holes?

    From Ethan Siegel: “Why Doesn’t Dark Matter Form Black Holes?” 

    Ethan Siegel


    An illustration of a black hole. Despite how dark it is, all black holes are thought to have formed from normal matter, not dark matter. Image credit: NASA/JPL-Caltech.

    Dark matter is the most abundant form of mass in our Universe. If you were to add up all the stars, planets, lifeforms, gas, dust, plasma and more — all the known, “normal” matter in our Universe — it would only account for about 15-to-17% of the total gravitation that we see. The remaining mass, outclassing the normal matter by a 5:1 ratio, must be completely invisible, meaning it doesn’t absorb or emit light at all. Yet it must interact gravitationally, enabling it to form large-scale structure in the Universe and to hold galaxies together. So why, then, can’t it form black holes?
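    As a quick arithmetic cross-check (a sketch, not from the article): the two figures just quoted agree with each other, since a 15-to-17% normal-matter share implies roughly a 5:1 dark-to-normal ratio.

```python
# If normal matter supplies a fraction f of the total gravitating mass,
# the dark-to-normal ratio is (1 - f) / f. Check both ends of the quoted
# 15-17% range against the quoted 5:1 ratio.

for normal_fraction in (0.15, 0.17):
    dark_fraction = 1.0 - normal_fraction
    print(round(dark_fraction / normal_fraction, 2))   # ~5.67 and ~4.88
```

Both endpoints bracket 5, so “15-to-17%” and “5:1” are two statements of the same measurement.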

    Black holes aren’t the only thing dark matter can’t form; it also can’t create dark matter stars, planets or dark atoms. Imagine the Universe as it might have been back in the very, very early stages, before there were any black holes, stars, planets or atoms.

    The early Universe was full of matter and radiation, and was so hot and dense that the quarks and gluons present didn’t form into individual protons and neutrons, but remained in a quark-gluon plasma. (Image credit: RHIC collaboration, Brookhaven, via http://www.bnl.gov/newsroom/news.php?a=11403)

    All we had was a hot, dense, expanding “sea” of matter and radiation of all the different types allowed. By the time the Universe is a few minutes old, the atomic nuclei are there, all the electrons are there, all the neutrinos and photons are there, and all the dark matter is there, too.

    They’re all flying around at incredible speeds, sure, but they’re also all exerting forces on one another. It’s true that they all feel the gravitational force (even photons, thanks to Einstein’s energy-mass equivalence), but gravity isn’t the only thing that matters here.

    In the hot, early Universe, prior to the formation of neutral atoms, photons scatter off of electrons (and to a lesser extent, protons) at a very high rate, transferring momentum when they do. (Images credit: Amanda Yoho)

    Photons and electrons have it the worst: they interact very frequently through the electromagnetic force, scattering and “bouncing” off of one another, exchanging energy and momentum, and colliding at an alarming rate. Nuclei fare only a little better: they’re much more massive, so their interaction rate is lower, and they pick up (or lose) less momentum with each collision.

    Neutrinos are much luckier: they don’t have an electric charge, and so they don’t interact through the electromagnetic force at all. Instead, they can only interact (besides gravity) through the weak force, which means collisions are incredibly infrequent. But dark matter gets it the best in terms of freedom: as far as we can tell, it only interacts through gravity. There are no collisions at all, and so all dark matter can do is be attracted to the other sources of matter.


    This might, you worry, make things worse! While normal matter has collisions and interactions preventing it from collapsing gravitationally, forming denser clumps, etc., the dark matter density begins to grow in the overdense regions. But this doesn’t happen the way you think of “collapse” happening. When a gas cloud collapses to form stars, what happens?

    A massive, gaseous nebula is where new stars in the Universe are born. (Image credit: ESO/VPHAS+ team, via http://www.eso.org/public/images/eso1403a/)

    The gas interacts through the gravitational force, becoming denser, but the matter that makes up that gas sticks together, allowing it to reach a denser state. That “stickiness” only happens thanks to the electromagnetic force! This is why things can collapse down to produce bound objects like stars, planets and even atoms.

    Without that stickiness? You’d just end up with a diffuse, loosely held together, “fluffy” structure bound together only through gravity. That’s why you hear of dark matter halos on galaxy and cluster scales, of dark matter filaments on even larger scales, and of no other dark matter structures.

    Now, these diffuse, fluffy halos are incredibly important: they represent the seeds of all the bound structure in the Universe today. This includes dwarf galaxies, normal galaxies, galaxy groups, galaxy clusters, superclusters and filaments, as well as all the substructure that makes these objects up. But without that extra force — without some “sticky” force to hold it together, to exchange energy and momentum — the dark matter is destined to remain in this fluffy, diffuse state. The normal matter can form the tightly-bound structures you’re used to, but the dark matter has no way to collide inelastically, to lose momentum or angular momentum, and hence, it has to remain loosely bound and “halo-like.”

    While stars might cluster in the disk and the normal matter might be restricted to a nearby region around the stars, dark matter extends in a halo more than 10 times the extent of the luminous portion. (Image credit: ESO/L. Calçada)

    It’s a little disconcerting to think that it’s not the gravitational force that leads to planets, stars, black holes and more, but gravity is just part of the equation. To really drive this point home, imagine that you took a ball of some type and launched it, with the ball — as you know — made out of atoms. What’s the ball going to do?

    A projectile under the influence of gravity will move in a parabola, until it strikes other matter (like the floor) that prevents it from moving further. (Image credit: Wikimedia Commons users MichaelMaggs Edit by Richard Bartz under c.c.a.-s.a.-3.0)

    Of course, it’ll move in a parabolic path (neglecting air resistance), rising up to a maximum height and falling down until it finally strikes the Earth. On a more fundamental scale, the ball moves in an elliptical orbit with the center-of-mass of the Earth as one focus of the ellipse, but the ground gets in the way of that ellipse, and so the portion we see looks like a parabola. But if you magically turned that ball into a clump of dark matter, what you’d get would surprise you greatly.

    Normal matter is stopped by the Earth, but dark matter would pass right through, making a near-perfect ellipse. (Image credit: Dave Goldberg of Ask A Mathematician/Ask A Physicist, via http://www.askamathematician.com/2012/01/q-why-does-gravity-make-some-things-orbit-and-some-things-fall/)

    Without the electromagnetic force, a whole bunch of terrible things happen:

    There’s no interaction, other than gravity, between the particles making up the ball and the atoms of the Earth. Instead of making a parabola, the dark matter clump goes all the way through the layers of the Earth, swinging around the center in an (almost-perfect) ellipse (but not quite, due to the layers and non-uniform density of the Earth), coming out near where it entered, making a parabola again, and continuing to orbit like that interminably.
    There are also no interactions holding this clump together! The atoms in a ball do have some random motions, but they are held together by the electromagnetic force, which preserves the ball-like structure. Remove that electromagnetic force, and the random motions of the dark matter particles will work to pull the clump apart, since its own self-gravity is insufficient to keep it bound together.

    This means that over time (and many orbits), the dark matter gets stretched into a long ellipse, and that ellipse gets more and more diffuse, similar to the particles that make up the debris stream from a comet, only even more diffuse!
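    The pass-through orbit described above is easy to sketch numerically. A minimal simulation, assuming an idealised uniform-density Earth (so the interior pull grows linearly with radius, an idealisation the article itself flags via Earth’s layers), shows a dark matter clump sailing through the planet and returning to its starting height roughly every 84 minutes:

```python
import math

# A dark matter "ball" released at Earth's surface feels only gravity, so it
# falls straight through. For a uniform-density sphere the interior pull is
# linear in radius, g(r) = G_S * r / R, giving simple harmonic motion with a
# period of 2*pi*sqrt(R / G_S) ~ 84 minutes.

R = 6.371e6     # Earth radius, m
G_S = 9.81      # surface gravity, m/s^2

def accel(x):
    """Gravitational acceleration at signed position x along a diameter."""
    if abs(x) <= R:
        return -G_S * x / R                             # linear inside the sphere
    return -G_S * (R / abs(x))**2 * math.copysign(1.0, x)  # inverse-square outside

# Semi-implicit Euler integration: start at rest on the surface.
x, v, dt, t = R, 0.0, 1.0, 0.0
crossed_centre = False
period = 2 * math.pi * math.sqrt(R / G_S)   # ~5063 s, about 84 minutes
while t < period:
    v += accel(x) * dt
    x += v * dt
    t += dt
    if x < 0:
        crossed_centre = True

print(crossed_centre)    # it passes through the centre, unlike normal matter
print(round(x / R, 2))   # back near +1.0 after one full oscillation
```

A ball of atoms stops at the surface; the dark matter clump just oscillates along the diameter, which is the frictionless, inelastic-collision-free behaviour the article describes.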

    The debris trail of Comet Encke. (Image credit: Gehrz, R. D., Reach, W. T., Woodward, C. E., and Kelley, M. S., 2006)

    Dark matter can’t form black holes or other tightly-bound structures because gravity alone isn’t enough to bind something tightly together. Because the force of gravity is so weak, it can only bind it loosely, which means huge, diffuse, very massive structures. If you want a “clump” of something — a star, a planet, or even an atom — you need a force that’s stronger than gravity to make it happen.

    There may yet be one! It is possible that dark matter self-interacts (or interacts with matter or radiation, at some level), but if it does, we only have constraints on how weak that interaction is. And it is very, very weak, if it’s even non-zero at all.

    If dark matter does have a self-interaction, its cross-section is tremendously low, as direct detection experiments have shown. (Image credit: Mirabolfathi, Nader arXiv:1308.0044 [astro-ph.IM], via https://inspirehep.net/record/1245953/plots)

    So even though we think of gravity as the only force that matters on the largest scales, the truth is when we think about the structures that we see — the ones that give off light, that house atoms and molecules, that collapse into black holes — it’s the other forces, in concert with gravity, that allow them to exist at all. You need some type of inelastic, sticky collision, and dark matter doesn’t have the right interactions to make that possible. Because of that, dark matter can’t make a galaxy, a star, a planet or a black hole. It takes more than gravity alone to do the job.

    See the full article here.


    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan Siegel.

  • richardmitnick 9:52 am on October 19, 2016 Permalink | Reply
    Tags: Dark Matter, No Number Of Additional Galaxies Can Prevent The Universe From Needing Dark Matter

    From Ethan Siegel: “No Number Of Additional Galaxies Can Prevent The Universe From Needing Dark Matter” 

    Ethan Siegel

    The Hubble eXtreme Deep Field (XDF), which revealed approximately 50% more galaxies-per-square-degree than the previous Ultra-Deep Field. Image credit: NASA; ESA; G. Illingworth, D. Magee, and P. Oesch, University of California, Santa Cruz; R. Bouwens, Leiden University; and the HUDF09 Team.

    It was perhaps the biggest news in space since we detected gravitational waves: instead of “billions and billions” of galaxies, there are at least two trillion of them — that’s 2,000,000,000,000 — within our observable Universe. Previously, the best estimate was merely 170 billion, coming from galaxy counts informed by the deepest observations of the Hubble Space Telescope. You might wonder, with more than ten times the galaxies present than we’d previously thought, whether this means that dark matter might not be necessary after all. Let’s see what the science has to say.

    The different shapes, structures and morphologies of some of the galaxies in Hickson Compact Group 59 show evidence for a wide variety of stars, plus gas, plasma and dust as well. Image credit: ESA/Hubble and NASA.

    If you take a look at stars, galaxies or clusters of galaxies in the nearby Universe, you can gather all the light available over the full set of wavelengths covering the electromagnetic spectrum. Because we think we know how stars work, measuring all of that light lets us calculate how much mass is present in the form of stars. This is one form of normal matter: matter made up of protons, neutrons and electrons. But stars aren’t all of it; there are plenty of other sources as well, like gas, dust, plasma, planets and black holes.

    A multiwavelength view of the Milky Way reveals the presence of many different phases and states of normal matter, far beyond the stars we’re used to seeing in visible light. Image credit: NASA.

    Each of them leaves its own signature, and each has its own methods to constrain or detect its presence and abundance. You might think that adding all of these different components together is how we get an estimate for the amount of matter in the Universe, but that’s actually a horrible approach, and not how we do it at all. Instead, there are three separate, independent signatures that measure the total normal matter content of the Universe all at once.

    An illustration of clustering patterns due to Baryon Acoustic Oscillations. Image credit: Zosia Rostomian.

    One is to look at the clustering data of all the different galaxies we observe. If you put your finger on one galaxy and ask, “how likely am I to find a galaxy at a particular distance away,” you’ll find a nice, smooth distribution as you increase that distance. But thanks to normal matter, there’s an increased likelihood of finding a galaxy that’s 500 million light years away versus finding one that’s either 400 or 600 million light years away. The amount of normal matter present determines this distance, and thanks to this technique, we get a very particular number for the amount of normal matter: about 5% of the critical density.
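    For a sense of scale, the preferred separation quoted above matches the standard baryon-acoustic-oscillation scale of roughly 147 megaparsecs (a measured value assumed here as background, not stated in the article):

```python
# Convert the standard baryon-acoustic-oscillation scale into the units
# used in the text: megaparsecs to millions of light years.

MLY_PER_MPC = 3.262      # million light years per megaparsec
bao_scale_mpc = 147      # sound-horizon scale, a standard measured value
bao_scale_mly = bao_scale_mpc * MLY_PER_MPC
print(round(bao_scale_mly))   # ~480 million light years: the "500 million" above
```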

    The fluctuations in the Cosmic Microwave Background, or the Big Bang’s leftover glow, contain a plethora of information about what’s encoded in the Universe’s history. Image credit: ESA and the Planck Collaboration.

    A second is to look at the fluctuations in the cosmic microwave background. The Big Bang’s leftover glow is one of the best signals we have from the young Universe to piece together what it was like in the distant past. While this map of the slightly hotter and cooler spots might look like random fluctuations to the naked eye, the fluctuations are larger than average on a very specific scale — about 0.5° — that corresponds to a very particular density of normal matter in the Universe. That density? About 5% of the critical density, the same as from the first method.
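    Cosmologists usually quote CMB features by multipole number rather than angle. A common rule of thumb relates the two as ℓ ≈ 180°/θ; applied to the ~0.5° scale in the text, that lands near ℓ ≈ 360. This is a rough back-of-the-envelope conversion, not an exact spherical-harmonic statement:

    ```python
    def multipole_from_angle(theta_deg):
        """Rule-of-thumb conversion: a feature spanning an angle theta
        (in degrees) on the sky appears near CMB multipole l ~ 180 / theta.
        This is an approximation for illustration, not an exact relation."""
        return 180.0 / theta_deg

    # The ~0.5 degree scale discussed above corresponds to roughly l ~ 360.
    l_baryon = multipole_from_angle(0.5)
    ```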

    An ultra-distant quasar will encounter gas clouds on the light’s journey to Earth, with some of the most distant clouds containing ultra-pristine gas that has never formed stars. Image credit: Ed Janssen, ESO.

    And finally, you can look at the earliest matter you can observe: pristine clouds of gas that have never formed a single star. Stars don’t form everywhere in the Universe at once, so if you can find an ultra-bright galaxy or a quasar that emits light from when the Universe was less than one billion years old, you might get lucky enough to find an intervening cloud of gas that absorbs some of that light. Those absorption features tell you what elements are present and in what abundance, and that in turn tells you how much normal matter must be present in the Universe to form those ratios of elements like hydrogen, deuterium, helium-3, helium-4 and lithium-7. The result from all this data? A Universe with about 5% of the critical density in the form of normal matter.

    The predicted abundances of helium-4, deuterium, helium-3 and lithium-7 as predicted by Big Bang Nucleosynthesis, with observations shown in the red circles. Image credit: NASA/WMAP Science Team.

    The fact that these three wildly independent methods all give the same answer for the density of normal matter is a particularly compelling argument that we know how much normal matter is in the Universe. When you hear a story about more stars, galaxies, gas or plasma being found in the Universe, that’s good, because it helps us understand where that 5% is located and how it’s distributed. More stars might mean less gas; more plasma might mean less dust; more planets and brown dwarfs might mean fewer black holes. But it can’t encroach on the other 27% that dark matter makes up, or the other 68% that dark energy composes.

    The percentages of normal matter, dark matter and dark energy in the Universe, as measured by our best cosmic probes before (L) and after (R) the first results of the Planck mission. Image credit: ESA and the Planck Collaboration.

    Those same sources of data that tell us the normal matter density — plus many others — can all be combined to paint a single cohesive picture of the Universe: 68% dark energy, 27% dark matter and 5% normal matter, with no more than 0.1% of anything else like neutrinos, photons or gravitational waves. It’s important to remember that the “5% normal matter” doesn’t just include stars or other light-emitting forms of matter, but rather everything that’s composed of protons, neutrons and electrons in the entire Universe. More stars, more galaxies or more sources of light might be a remarkably interesting discovery, but it doesn’t mean that we don’t need dark matter. In fact, to obtain the Universe as we observe it to be, dark matter is an indispensable ingredient.
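    To put the “5% of the critical density” figure in absolute terms, the critical density follows from the Friedmann equation, ρ_c = 3H₀²/(8πG). A minimal sketch, assuming a Planck-like H₀ = 67.7 km/s/Mpc and the rounded 68/27/5 budget quoted above:

    ```python
    import math

    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    MPC_M = 3.0857e22    # meters per megaparsec
    H0 = 67.7e3 / MPC_M  # assumed Hubble constant (67.7 km/s/Mpc) in s^-1

    # Critical density: rho_c = 3 H0^2 / (8 pi G), roughly 8.6e-27 kg/m^3.
    rho_crit = 3.0 * H0 ** 2 / (8.0 * math.pi * G)

    # Split into the rounded budget from the text (fractions must sum to 1).
    budget = {"dark energy": 0.68, "dark matter": 0.27, "normal matter": 0.05}
    densities = {name: frac * rho_crit for name, frac in budget.items()}

    # Normal matter works out to only about a quarter of a proton mass
    # per cubic meter, averaged over the whole Universe.
    M_PROTON = 1.6726e-27  # kg
    protons_per_m3 = densities["normal matter"] / M_PROTON
    ```

    The striking takeaway is how empty the average cubic meter of space is: all the protons, neutrons and electrons in the Universe amount to roughly one proton per four cubic meters.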

    Access the mp4 video here.

    The discovery that there are more galaxies than we’d ever known before better informs us how the matter we have is distributed, but does nothing to change what the matter itself fundamentally is. We’re still on the hunt, to be sure, for what dark matter and dark energy actually are. But from a cosmic perspective, these new observations don’t change our picture of what’s out there; for dark matter and dark energy to be wrong, something would have to be off about what we’ve already seen. Nevertheless, we have no choice but to keep looking. The mysteries of nature might not yield easily, but neither does human curiosity.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

  • richardmitnick 6:33 pm on October 17, 2016 Permalink | Reply
    Tags: Dark Matter

    From SURF: “LUX: The end of an era” 

    SURF logo
    Sanford Underground levels

    Sanford Underground Research facility

    October 17, 2016
    Constance Walter

    The top of the LUX detector can be seen emerging from the water tank. From left: Doug Tiedt, Wei Ji and Ken Wilson work on the removal.
    Credit: Matthew Kapust

    Five years ago, the Large Underground Xenon (LUX) experiment began its long journey to the Davis Cavern on the 4850 Level of Sanford Lab. Results published in 2013 proved LUX to be the most sensitive dark matter experiment in the world. When LUX completed its 300-live-day run in May of this year, the world learned LUX was even more sensitive than previously determined.

    Earlier this month, the LUX collaboration began decommissioning the experiment. “It’s bittersweet, the end of an era, but it was time,” said Simon Fiorucci, a LUX collaborator from Lawrence Berkeley National Laboratory.

    “The detector delivered everything we promised in sensitivity and then went even further,” said Rick Gaitskell, physics professor at Brown University and a co-spokesperson for LUX. “So there is great pride, but also sadness to see an old friend being pensioned off. Of course, the success of LUX acted as an important pathfinder for the larger LZ experiment.”

    LZ (LUX-ZEPLIN), the second-generation dark matter detector, will hold 30 times more xenon and be 100 times more sensitive than LUX.
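    As a quick sanity check on the scale-up, multiplying the 370 kg of xenon quoted for LUX later in this article by the stated factor of 30 gives roughly 11 tonnes, consistent with LZ’s roughly 10-tonne-class design:

    ```python
    LUX_XENON_KG = 370.0  # xenon mass in LUX, as stated later in the article
    SCALE_FACTOR = 30     # LZ will hold ~30 times more xenon

    # Back-of-the-envelope LZ xenon mass: ~11,000 kg, i.e. on the order
    # of 10 tonnes (a rough consistency check, not an official spec).
    lz_xenon_kg = LUX_XENON_KG * SCALE_FACTOR
    ```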

    Lux Zeplin project at SURF

    It will continue the hunt for WIMPs, or weakly interacting massive particles, the leading candidates for explaining dark matter, which so far has been observed only through its gravitational effects on galaxies.

    “The nature of dark matter, which comprises 85 percent of all matter in the universe, is one of the most perplexing mysteries in all of contemporary science,” said Harry Nelson, LZ spokesperson and a physics professor at University of California, Santa Barbara. “Just as science has elucidated the nature of familiar matter, LZ will lead science in testing one of the most attractive hypotheses for the nature of dark matter.”

    LZ recently received approval from the Department of Energy that set in motion the build-out of major components and the preparation of the Davis Cavern. But to make way for the new experiment, LUX must be completely uninstalled—with the exception of the water tank in which LZ will be housed.

    “Essentially, we have to do everything we did to build the LUX detector, but in reverse,” Gaitskell said.

    But decommissioning isn’t as simple as pulling the detector vessel out of the 72,000-gallon water tank in which it has resided for four years. The team first had to remove the 370 kg of xenon and prepare it for transport to SLAC National Accelerator Laboratory. Then they disabled the support system and disconnected thousands of cables. Next, the detector was removed from the water tank and readied for its trip to the surface. The vessel will be opened and the parts analyzed for possible use in LZ.

    “By March we should be removing the last table and chair and handing the space over to LZ,” Fiorucci said.

    Construction of LZ will begin in 2017. Operations are expected to begin in 2020.

    “And so, the process of build, operate, and deconstruct begins again,” Gaitskell said.

    See the full article here.

    About us.
    The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.
    LUX/Dark matter experiment at SURF

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE), a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab, is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.

    Fermilab LBNE
