Tagged: Dark Energy

  • richardmitnick 2:00 pm on September 29, 2017 Permalink | Reply
    Tags: Dark Energy

    From CfA: “New Insights on Dark Energy” 

    Harvard-Smithsonian Center for Astrophysics


    Center For Astrophysics

    Inflationary Universe. NASA/WMAP

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    The universe is not only expanding – it is accelerating outward, driven by what is commonly referred to as “dark energy.”

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam at an altitude of 7200 feet at Cerro Tololo, Chile

    The term is a poetic analogy to the label “dark matter,” the mysterious material that dominates the matter in the universe and that really is dark, because it does not radiate light (it reveals itself only via its gravitational influence on galaxies).

    Dark Matter Research

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists studying the cosmic microwave background hope to learn about more than just how the universe grew—it could also offer insight into dark matter, dark energy and the mass of the neutrino.

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al.

    Dark Matter Particle Explorer China

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB deep in Sudbury’s Creighton Mine

    LUX Dark matter Experiment at SURF, Lead, SD, USA

    ADMX Axion Dark Matter Experiment, U Washington

    Two explanations are commonly advanced to explain dark energy. The first, as Einstein once speculated, is that gravity itself causes objects to repel one another when they are far enough apart (he added this “cosmological constant” term to his equations). The second explanation hypothesizes (based on our current understanding of elementary particle physics) that the vacuum has properties that provide energy to the cosmos for expansion.
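Einstein’s speculation corresponds to adding a constant term, the cosmological constant Λ, to his field equations. Schematically (stated here as general background, in standard notation):

```latex
% Einstein's field equations with the cosmological constant \Lambda:
%   G_{\mu\nu} : Einstein tensor (spacetime curvature)
%   T_{\mu\nu} : stress-energy tensor (matter and radiation content)
\[
  G_{\mu\nu} + \Lambda\, g_{\mu\nu} \;=\; \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
\]
% A positive \Lambda acts as a repulsive term at large separations;
% equivalently, it behaves as a constant vacuum energy density
%   \rho_{\Lambda} = \Lambda c^{2} / (8\pi G),
% which connects the first explanation (modified gravity) to the
% second (vacuum energy).
```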

    For several decades cosmologists have successfully used a relativistic equation with dark matter and dark energy to explain increasingly precise observations of the cosmic microwave background, the cosmological distribution of galaxies, and other large-scale cosmic features.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    But as the observations have improved, some apparent discrepancies have emerged. One of the most notable is the age of the universe: there is an almost 10% difference between measurements inferred from the Planck satellite data and those from so-called Baryon Acoustic Oscillation experiments. The former relies on far-infrared and submillimeter measurements of the cosmic microwave background [CMB], the latter on the spatial distribution of visible galaxies.
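To get a feel for the size of that tension, the short sketch below computes the fractional difference between two representative values of the Hubble constant, whose inverse roughly sets the inferred age of the universe. The specific numbers are illustrative assumptions, not figures quoted in the article:

```python
# Illustrative check of the roughly-10% tension between cosmological
# measurements. The Hubble constant values below are representative
# assumptions, not data quoted in the article.
H0_cmb = 67.4   # km/s/Mpc, roughly the CMB-inferred (Planck) value
H0_alt = 73.0   # km/s/Mpc, roughly the value from other distance probes

# The inferred age of the universe scales approximately as 1/H0,
# so the fractional age difference tracks the fractional H0 difference.
fractional_difference = (H0_alt - H0_cmb) / H0_cmb
print(f"Fractional difference: {fractional_difference:.1%}")
# about 8% with these illustrative numbers
```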

    BOSS Supercluster Baryon Oscillation Spectroscopic Survey (BOSS)

    CMB per ESA/Planck

    ESA/Planck

    CfA astronomer Daniel Eisenstein was a member of a large consortium of scientists who suggest that most of the difference between these two methods, which sample different components of the cosmic fabric, could be reconciled if the dark energy were not constant in time. The scientists apply sophisticated statistical techniques to the relevant cosmological datasets and conclude that if the dark energy term varied slightly as the universe expanded (though still subject to other constraints), it could explain the discrepancy. Direct evidence for such a variation would be a dramatic breakthrough, but so far has not been obtained. One of the team’s major new experiments, the Dark Energy Spectroscopic Instrument (DESI) Survey…
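A standard way to make “dark energy varying slightly as the universe expanded” quantitative is the equation-of-state parameter w = p/(ρc²). A cosmological constant has w = −1 at all times; the Chevallier–Polarski–Linder (CPL) parametrization, one common choice (stated here as general background, not necessarily the exact form used in the paper), lets w evolve with the cosmic scale factor a:

```latex
% Dark-energy equation of state: w \equiv p / (\rho c^{2}).
% A cosmological constant has w = -1 exactly, at all epochs.
% CPL parametrization, with scale factor a (a = 1 today) and redshift z:
\[
  w(a) \;=\; w_{0} + w_{a}\,(1 - a),
  \qquad
  w(z) \;=\; w_{0} + w_{a}\,\frac{z}{1+z}
\]
% w_0 is the value today; w_a measures how strongly w evolved in the past.
% (w_0, w_a) = (-1, 0) recovers the constant dark energy of \Lambda CDM,
% so any detection of (w_0, w_a) away from (-1, 0) would signal
% time-varying dark energy.
```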

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    …could settle the matter. It will map over twenty-five million galaxies in the universe, reaching back to objects only a few billion years after the Big Bang, and should be completed sometime in the mid-2020s.

    Reference(s):

    Dynamical Dark Energy in Light of the Latest Observations, Gong-Bo Zhao et al., Nature Astronomy 1, 627 (2017)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Center for Astrophysics combines the resources and research facilities of the Harvard College Observatory and the Smithsonian Astrophysical Observatory under a single director to pursue studies of those basic physical processes that determine the nature and evolution of the universe. The Smithsonian Astrophysical Observatory (SAO) is a bureau of the Smithsonian Institution, founded in 1890. The Harvard College Observatory (HCO), founded in 1839, is a research institution of the Faculty of Arts and Sciences, Harvard University, and provides facilities and substantial other support for teaching activities of the Department of Astronomy.

     
  • richardmitnick 2:17 pm on September 26, 2017 Permalink | Reply
    Tags: Dark Energy, EGS Collab - Enhanced Geothermal Systems Collaboration, Listening to the Earth to harness geothermal energy, SIGMA-V

    From SURF: “Listening to the Earth to harness geothermal energy “ 

    SURF logo
    Sanford Underground levels

    Sanford Underground Research Facility

    September 25, 2017
    Constance Walter

    Geothermal energy has the potential to power 100 million homes in America.

    Hunter Knox and Bill Roggenthen from South Dakota School of Mines lower sensors down a set of holes that were drilled for the kISMET experiment. Matthew Kapust

    As a geophysicist, Hunter Knox has worked all over the world testing bridges, dams and levees, and listening to the sounds of the earth. She even peered toward the center of the earth from a volcano in Antarctica with an open convecting lava lake.

    “I’m a seismologist. It’s what I do.”

    Now the field coordinator from Sandia National Laboratories (SNL) is setting her sights on Sanford Lab’s 4850 Level, where she’s planning the logistics for SIGMA-V, a project under the auspices of the Enhanced Geothermal Systems Collaboration (EGS Collab).

    Led by Lawrence Berkeley National Laboratory, the EGS Collab recently received a $9 million grant from the Department of Energy to study geothermal systems. It is believed this clean-energy technology could power up to 100 million American homes.

    But before that can happen, more studies need to be done.

    “We need to better understand how fractures created in deep, hard-rock environments can be used to produce geothermal energy,” Knox said.

    Building on data collected from the recent kISMET experiment at Sanford Lab, the collaboration hopes to expand its understanding of the rock stress and incorporate additional equipment to meet the needs of EGS technology.

    “A typical geothermal system mines heat from the earth by extracting steam or hot water,” said Tim Kneafsey, principal investigator for EGS Collab and a staff earth scientist with LBNL. But for that to happen, three things are needed: hot rock, fluid and the ability for fluid to move through rock.

    “These conditions are not met everywhere,” Kneafsey said. “There is a lot of accessible hot rock, but it may be missing the permeability or fluid or both.”

    “We know fracturing rock can be done. But can it be effective for geothermal purposes? We need good, well-monitored field tests of fracturing, particularly in crystalline rock, to better understand that,” he said.

    That’s where SIGMA-V—or Stimulation Investigations for Geothermal Modeling and Analysis—comes in. “SIGMA-V is shorthand for vertical stress,” Kneafsey said.

    The goal of the project is to collect data that will allow the team to build better predictive geomechanical models and, in turn, better understand the subsurface of the earth. The team will drill two boreholes: one for injection and one for production. Each will be 60 meters long, oriented along the direction of minimum horizontal stress. Six additional monitoring boreholes will contain seismic, electrical and fiber optic sensors.

    When the holes are drilled, the team will place “straddle packers” (a mandrel, or pipe, with a deflated balloon at either end) inside them. Once inside, they will inflate the balloons and flow water down the pipe to create a sealed section. They will continue to pump water until the rock fractures, using the monitoring equipment to listen for acoustic emissions, the sounds that tell them what is happening within the rock.
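The fluid pressure at which the rock gives way is governed by the in-situ stresses. A classical estimate is the Hubbert–Willis breakdown criterion for a vertical borehole, given here as general background rather than as the SIGMA-V team’s specific model:

```latex
% Hubbert--Willis breakdown pressure for a vertical borehole:
%   P_b : fluid pressure at which a tensile fracture initiates
%   \sigma_h, \sigma_H : minimum and maximum horizontal stresses
%   T_0 : tensile strength of the rock; P_0 : pore pressure
\[
  P_b \;=\; 3\sigma_h - \sigma_H + T_0 - P_0
\]
% Hydraulic fractures open against the minimum principal stress,
% which is why the injection and production boreholes are drilled
% along the minimum horizontal stress direction.
```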

    “One of the problems with EGS is that it is difficult to maintain the fracture network,” Knox said. “Since the boreholes are hard to drill in these hot and very hard rocks and the fracture networks can’t be sustained, it is challenging to maintain an adequate heat exchanger to pull the energy out. We want to figure out how to maintain these networks so we can use the heat for energy.”

    And so, she’ll continue to listen to the rock nearly a mile underground and, perhaps, learn the secret to using it for geothermal energy.

    Forging ahead

    Data collected from SIGMA-V will be applied toward the Frontier Observatory for Research in Geothermal Energy (FORGE), a flagship DOE geothermal project, Kneafsey said. FORGE aims to develop technologies needed to create large-scale, economically sustainable heat exchange systems, thus paving the way for a reproducible approach that will reduce risks associated with EGS development.

    The two FORGE sites are in Fallon, Nevada, which is led by Sandia National Laboratories; and Milford, Utah, led by the University of Utah. The FORGE initiative will include innovative drilling techniques, reservoir stimulation techniques and well connectivity and flow-testing efforts.

    The EGS Collab includes researchers from eight national labs—LBNL, SNL, Lawrence Livermore National Laboratory, Pacific Northwest National Laboratory, Idaho National Laboratory, Los Alamos National Laboratory, National Renewable Energy Laboratory, and Oak Ridge National Laboratory; and six universities—South Dakota School of Mines and Technology, Stanford, University of Wisconsin, University of Oklahoma, Colorado School of Mines and Penn State.

    Some information for this article was provided by LBNL: http://newscenter.lbl.gov/2017/07/20/berkeley-lab-lead-multimillion-dollar-geothermal-energy-project/

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    About us.
    The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.
    LUX dark matter experiment at SURF

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE), a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab, is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.

    Fermilab LBNE
    LBNE

     
  • richardmitnick 9:55 pm on September 5, 2017 Permalink | Reply
    Tags: Dark Energy

    From Symmetry: “What can particles tell us about the cosmos?” 

    Symmetry Mag
    Symmetry

    09/05/17
    Amanda Solliday

    The minuscule and the immense can reveal quite a bit about each other.

    In particle physics, scientists study the properties of the smallest bits of matter and how they interact. Another branch of physics—astrophysics—creates and tests theories about what’s happening across our vast universe.

    The current theoretical framework that describes elementary particles and their forces, known as the Standard Model, is based on experiments that started in 1897 with the discovery of the electron. Today, we know that there are six leptons, six quarks, four force carriers and a Higgs boson. Scientists all over the world predicted the existence of these particles and then carried out the experiments that led to their discoveries. Learn all about the who, what, where and when of the discoveries that led to a better understanding of the foundations of our universe.

    While particle physics and astrophysics appear to focus on opposite ends of a spectrum, scientists in the two fields actually depend on one another. Several current lines of inquiry link the very large to the very small.

    The seeds of cosmic structure

    For one, particle physicists and astrophysicists both ask questions about the growth of the early universe.

    In her office at Stanford University, Eva Silverstein explains her work parsing the mathematical details of the fastest period of that growth, called cosmic inflation.

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    “To me, the subject is particularly interesting because you can understand the origin of structure in the universe,” says Silverstein, a professor of physics at Stanford and the Kavli Institute for Particle Astrophysics and Cosmology. “This paradigm known as inflation accounts for the origin of structure in the most simple and beautiful way a physicist can imagine.”

    Scientists think that after the Big Bang, the universe cooled, and particles began to combine into hydrogen atoms. This process released previously trapped photons—elementary particles of light.

    The glow from that light, called the cosmic microwave background, lingers in the sky today.

    CMB per ESA/Planck

    Scientists measure different characteristics of the cosmic microwave background to learn more about what happened in those first moments after the Big Bang.

    According to scientists’ models, a pattern that first formed on the subatomic level eventually became the underpinning of the structure of the entire universe. Places that were dense with subatomic particles—or even just virtual fluctuations of subatomic particles—attracted more and more matter. As the universe grew, these areas of density became the locations where galaxies and galaxy clusters formed. The very small grew up to be the very large.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists studying the cosmic microwave background hope to learn about more than just how the universe grew—it could also offer insight into dark matter, dark energy and the mass of the neutrino.

    Dark Matter

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al.

    Dark Matter Particle Explorer China

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB deep in Sudbury’s Creighton Mine

    LUX Dark matter Experiment at SURF, Lead, SD, USA

    ADMX Axion Dark Matter Experiment, U Washington

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam at an altitude of 7200 feet at Cerro Tololo, Chile

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA


    FNAL DUNE Argon tank at SURF


    Surf-Dune/LBNF Caverns at Sanford



    SURF building in Lead SD USA

    “It’s amazing that we can probe what was going on almost 14 billion years ago,” Silverstein says. “We can’t learn everything that was going on, but we can still learn an incredible amount about the contents and interactions.”

    For many scientists, “the urge to trace the history of the universe back to its beginnings is irresistible,” wrote theoretical physicist Steven Weinberg in his 1977 book The First Three Minutes. The Nobel laureate added, “From the start of modern science in the sixteenth and seventeenth centuries, physicists and astronomers have returned again and again to the problem of the origin of the universe.”

    Searching in the dark

    Particle physicists and astrophysicists both think about dark matter and dark energy. Astrophysicists want to know what made up the early universe and what makes up our universe today. Particle physicists want to know whether there are undiscovered particles and forces out there for the finding.

    “Dark matter makes up most of the matter in the universe, yet no known particles in the Standard Model [of particle physics] have the properties that it should possess,” says Michael Peskin, a professor of theoretical physics at SLAC.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    “Dark matter should be very weakly interacting, heavy or slow-moving, and stable over the lifetime of the universe.”

    There is strong evidence for dark matter through its gravitational effects on ordinary matter in galaxies and clusters. These observations indicate that the universe is made up of roughly 5 percent normal matter, 25 percent dark matter and 70 percent dark energy. But to date, scientists have not directly observed dark energy or dark matter.

    “This is really the biggest embarrassment for particle physics,” Peskin says. “However much atomic matter we see in the universe, there’s five times more dark matter, and we have no idea what it is.”

    But scientists have powerful tools to try to understand some of these unknowns. Over the past several years, the number of models of dark matter has been expanding, along with the number of ways to detect it, says Tom Rizzo, a senior scientist at SLAC and head of the theory group.

    Some experiments search for direct evidence of a dark matter particle colliding with a matter particle in a detector. Others look for indirect evidence of dark matter particles interfering in other processes or hiding in the cosmic microwave background. If dark matter has the right properties, scientists could potentially create it in a particle accelerator such as the Large Hadron Collider.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Physicists are also actively hunting for signs of dark energy. It is possible to measure the properties of dark energy by observing the motion of clusters of galaxies at the largest distances that we can see in the universe.

    “Every time that we learn a new technique to observe the universe, we typically get lots of surprises,” says Marcelle Soares-Santos, a Brandeis University professor and a researcher on the Dark Energy Survey. “And we can capitalize on these new ways of observing the universe to learn more about cosmology and other sides of physics.”

    Forces at play

    Particle physicists and astrophysicists find their interests also align in the study of gravity. For particle physicists, gravity is the one basic force of nature that the Standard Model does not quite explain. Astrophysicists want to understand the important role gravity played and continues to play in the formation of the universe.

    In the Standard Model, each force has what’s called a force-carrier particle or a boson. Electromagnetism has photons. The strong force has gluons. The weak force has W and Z bosons. When particles interact through a force, they exchange these force-carriers, transferring small amounts of information called quanta, which scientists describe through quantum mechanics.

    General relativity explains how the gravitational force works on large scales: Earth pulls on our own bodies, and planetary objects pull on each other. But it is not understood how gravity is transmitted by quantum particles.

    Discovering a subatomic force-carrier particle for gravity would help explain how gravity works on small scales and inform a quantum theory of gravity that would connect general relativity and quantum mechanics.

    Compared to the other fundamental forces, gravity interacts with matter very weakly, but the strength of the interaction quickly becomes larger with higher energies. Theorists predict that at high enough energies, such as those seen in the early universe, quantum gravity effects are as strong as the other forces. Gravity played an essential role in transferring the small-scale pattern of the cosmic microwave background into the large-scale pattern of our universe today.

    “Another way that these effects can become important for gravity is if there’s some process that lasts a long time,” Silverstein says. “Even if the energies aren’t as high as they would need to be sensitive to effects like quantum gravity instantaneously.”

    Physicists are modeling gravity over lengthy time scales in an effort to reveal these effects.

    Our understanding of gravity is also key in the search for dark matter. Some scientists think that dark matter does not actually exist; they say the evidence we’ve found so far is actually just a sign that we don’t fully understand the force of gravity.

    Big ideas, tiny details

    Learning more about gravity could tell us about the dark universe, which could also reveal new insight into how structure in the universe first formed.

    Scientists are trying to “close the loop” between particle physics and the early universe, Peskin says. As scientists probe space and go back further in time, they can learn more about the rules that govern physics at high energies, which also tells us something about the smallest components of our world.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 7:06 am on July 3, 2017 Permalink | Reply
    Tags: Dark Energy, Tachyons

    From COSMOS: “Can faster-than-light particles explain dark matter, dark energy, and the Big Bang?” 

    Cosmos Magazine bloc

    COSMOS

    30 June 2017
    Robyn Arianrhod

    Tachyons may explain dark matter, dark energy and the black holes at the core of many galaxies. Andrzej Wojcicki / Science Photo Library / Getty.

    Here are six big questions about our universe that current physics can’t answer:

    What is dark energy, the mysterious energy that appears to be accelerating the expansion of the universe?
    What is dark matter, the invisible substance we can only detect by its gravitational effect on stars and galaxies?
    What caused inflation, the blindingly fast expansion of the universe immediately after the Big Bang?
    For that matter, what caused the Big Bang?
    Are there many possible Big Bangs or universes?
    Is there a telltale characteristic associated with the death of a universe?

    Despite the efforts of some of the world’s brightest brains, the Standard Model of particle physics – our current best theory of how the universe works at a fundamental level – has no solution to these stumpers.

    A compelling new theory claims to solve all six in a single sweep. The answer, according to a paper published in European Physical Journal C by Herb Fried from Brown University and Yves Gabellini from INLN-Université de Nice, may be a kind of particle called a tachyon.

    Tachyons are hypothetical particles that travel faster than light. According to Einstein’s special theory of relativity – and according to experiment so far – in our ‘real’ world, particles can never travel faster than light. Which is just as well: if they did, our ideas about cause and effect would be thrown out the window, because it would be possible to see an effect manifest before its cause.

    Although it is elegantly simple in conception, Fried and Gabellini’s model is controversial because it requires the existence of these tachyons: specifically electrically charged, fermionic tachyons and anti-tachyons, fluctuating as virtual particles in the quantum vacuum (QV). (The idea of virtual particles per se is nothing new: in the Standard Model, forces like electromagnetism are regarded as fields of virtual particles constantly ducking in and out of existence. Taken together, all these virtual particles make up the quantum vacuum.)

    But special relativity, though it bars faster-than-light travel for ordinary matter and photons, does not entirely preclude the existence of tachyons. As Fried explains, “In the presence of a huge-energy event, such as a supernova explosion or the Big Bang itself, perhaps these virtual tachyons can be torn out of the QV and sent flying into the real vacuum (RV) of our everyday world, as real particles that have yet to be measured.”

    If these tachyons do cross the speed-of-light boundary, the researchers believe that their high masses and small distances of interaction would introduce into our world an immeasurably small amount of ‘a-causality’.

    Fried and Gabellini arrived at their tachyon-based model while trying to find an explanation for the dark energy throughout space that appears to fuel the accelerating expansion of the universe. They first proposed that dark energy is produced by fluctuations of virtual pairs of electrons and positrons.

    However, this model ran into mathematical difficulties in the form of unwanted imaginary numbers. In special relativity, the rest mass of a tachyon is an imaginary number, unlike the rest mass of ordinary particles. While the equations and imaginary numbers in the new model involve far more than simple masses, the idea is suggestive: Gabellini realized that by including fluctuating pairs of tachyons and anti-tachyons, he and Fried could cancel the unwanted imaginary numbers out of their calculations. And a huge bonus followed from this creative response to mathematical necessity: Gabellini and Fried realized that by adding tachyons to the model, they could explain inflation too.
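The imaginary rest mass follows directly from the relativistic energy formula. For a particle of rest mass m moving at speed v:

```latex
% Relativistic energy of a particle with rest mass m and speed v:
\[
  E \;=\; \frac{m c^{2}}{\sqrt{1 - v^{2}/c^{2}}}
\]
% For v > c the square root is imaginary, so E can only stay real if the
% rest mass is itself imaginary: writing m = i\mu with \mu real gives
\[
  E \;=\; \frac{\mu c^{2}}{\sqrt{v^{2}/c^{2} - 1}},
\]
% which is real and positive for any v > c. This is the textbook sense in
% which a tachyon's rest mass is "imaginary."
```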

    “This assumption [of fluctuating tachyon-anti-tachyon pairs] cannot be negated by any experimental test,” says Fried – and the model fits beautifully with existing experimental data on dark energy and inflation energy.

    Of course, both Fried and Gabellini recognize that many physicists are wary of theories based on such radical assumptions.

    But, taken as a whole, their model suggests the possibility of a unifying mechanism that gives rise not only to inflation and dark energy, but also to dark matter. Calculations suggest that these high-energy tachyons would re-absorb almost all of the photons they emit and hence be invisible.

    And there is more: as Fried explains, “If a very high-energy tachyon flung into the real vacuum (RV) were then to meet and annihilate with an anti-tachyon of the same species, this tiny quantum ‘explosion’ of energy could be the seed of another Big Bang, giving rise to a new universe. That ‘seed’ would be an energy density, at that spot of annihilation, which is so great that a ‘tear’ occurs in the surface separating the Quantum Vacuum from the RV, and the huge energies stored in the QV are able to blast their way into the RV, producing the Big Bang of a new universe. And over the course of multiple eons, this situation could happen multiple times.”

    This model – like any model of such non-replicable phenomena as the creation of the universe – may be simply characterized as a tantalizing set of speculations. Nevertheless, it not only fits with data on inflation and dark energy, but also offers a possible solution to yet another observed mystery.

    Within the last few years, astronomers have realized that the black hole at the centre of our Milky Way galaxy is ‘supermassive’, containing the mass of a million or more suns. And the same sort of supermassive black hole (SMBH) may be seen at the centres of many other galaxies in our current universe.

    Exactly how such objects form is still an open question. The energy stored in the QV is normally large enough to counteract the gravitational tendency of galaxies to collapse in on themselves. In the theory of Fried and Gabellini, however, when a new universe forms, a huge amount of the QV energy from the old universe escapes through the ‘tear’ made by the tachyon-anti-tachyon annihilation (the new Big Bang). Eventually, even faraway parts of the old universe will be affected, as the old universe’s QV energy leaks into the new universe like air escaping through a hole in a balloon. The decrease in this QV-energy buffer against gravity in the old universe suggests that as the old universe dies, many of its galaxies will form SMBHs in the new universe, each containing the mass of the old galaxy’s former suns and planets. Some of these new SMBHs may form the centres of new galaxies in the new universe.

    “This may not be a very pleasant picture,” says Fried, speaking of the possible fate of our own universe. “But it is at least scientifically consistent.”

    And in the weird, untestable world of Big Bangs and multiple universes, consistency may be the best we can hope for.

See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:44 am on June 14, 2017 Permalink | Reply
    Tags: , , , , Dark Energy, Human waste used as biosolids for fertilizer, Macdonald campus in Ste-Anne-de-Bellevue, , McGill gets $3 million to fund research into cutting greenhouse gases, Mitigating greenhouse gas emissions caused by water and fertilizer use in agriculture,   

    From McGill via Montreal Gazette: “McGill gets $3 million to fund research into cutting greenhouse gases” 

    McGill University


    Montreal Gazette

    June 14, 2017
    John Meagher

    McGill professor Grant Clark displays human waste used as biosolids for fertilizer, on test fields at Macdonald campus on Monday. The federal government is investing in the university to conduct research on greenhouse gas mitigation in agriculture. Pierre Obendrauf / Montreal Gazette

    McGill University researchers at Macdonald campus in Ste-Anne-de-Bellevue got some welcome news Monday when the federal government announced nearly $3 million in funding for research projects that will help farmers cut greenhouse gas emissions.

    Local Liberal MP Francis Scarpaleggia and Jean-Claude Poissant, Parliamentary Secretary for the Minister of Agriculture, announced $2.9 million in funding at a press conference for two McGill projects aimed at mitigating greenhouse gas emissions caused by water and fertilizer use in agriculture.

    Scarpaleggia said the funding will “enable our agricultural sector to be a world leader and to develop new clean technologies and practices to enhance the economic and environmental sustainability of Canadian farms.”

    A project led by Prof. Chandra Madramootoo, of McGill’s Department of Bioresource Engineering, will receive more than $1.6 million to study the effects of different water management systems in Eastern Canada.

    The aim is to provide information on water-management practices that reduce greenhouse gas emissions while increasing agricultural productivity.

    The second project, headed by McGill Prof. Grant Clark, also of the Department of Bioresource Engineering, will receive $1.3 million. The project will research best management practices for the use of municipal bio-solids, a by-product of wastewater treatment plants, as a crop fertilizer.

    “I’m a firm believer in science-based policy,” Clark said. “And we require the support of government to develop the knowledge to promote that policy.

    “I would also like to acknowledge the government’s support of real concrete action to (address) climate change and reduce greenhouse gas emissions.”

Clark said the research project will examine how to “reduce, reuse, recycle, reclaim” the use of nutrients and organics in agriculture.

“If we are going to develop a sustainable agricultural system, we must be conscious of how we conserve resources, reduce inputs as well as reduce greenhouse gas emissions and build and preserve the health of our soils,” he said.

    “We are interested in linking the intensive food production required to support a growing global population with the recycling of organic wastes from our municipal centres,” Clark added.

    “The objective of the program is to use the residual solids from the treatment of municipal waste waters, or biosolids, as fertilizers for agricultural production. So this mirrors the natural cycling of nutrients or organic carbon that we see in nature. However, we can’t just go out and poop in the field. The cycle is a little more involved in order that we preserve public health and hygiene.”

    Scarpaleggia described the research work being done at the Macdonald campus in Ste-Anne as “world class.”

    “The federal government has always recognized the enormous value of Macdonald campus as a world-class research facility,” said the MP for Lac-St-Louis riding.

    “They’re doing groundbreaking work here in any areas of agriculture, including water management, which is a particular interest of mine. So it’s very important to channel some research funds to Macdonald campus.”

    Scarpaleggia said the McGill projects being funded by federal government will promote job growth in the green economy.

    “As we move ahead with climate change policies, we are, as a consequence, stimulating research, stimulating industrial innovation. We’re making that jump to the green economy with all its benefits in terms of employment and high value-added jobs.”

    The federal funding, which comes from the Agricultural Greenhouse Gases Program (AGGP), was made on behalf of Lawrence MacAuley, the Minister of Agriculture and Agri-Food Canada.

    “The Government of Canada continues to invest in research with partners like McGill University in order to provide our farmers with the best strategies for adapting to climate change and for producing more quality food for a growing population while keeping agriculture clean and sustainable,” said Poissant.

The AGGP is a $27-million initiative aimed at helping the agricultural sector adjust to climate change and improve soil and water conservation. McGill’s agronomists and scientists are involved in 20 new research projects being conducted across Canada, from British Columbia to the Maritimes.

See the full article here.


    All about McGill

    With some 300 buildings, more than 38,500 students and 250,000 living alumni, and a reputation for excellence that reaches around the globe, McGill has carved out a spot among the world’s greatest universities.
    Founded in Montreal, Quebec, in 1821, McGill is a leading Canadian post-secondary institution. It has two campuses, 11 faculties, 11 professional schools, 300 programs of study and some 39,000 students, including more than 9,300 graduate students. McGill attracts students from over 150 countries around the world, its 8,200 international students making up 21 per cent of the student body.

     
  • richardmitnick 2:16 pm on June 10, 2017 Permalink | Reply
Tags: Dark Energy, The largest virtual Universe ever simulated, U Zürich

    From U Zürich: “The largest virtual Universe ever simulated.” 

    University of Zürich

    9 June 2017
    Contact
    Prof. Dr. Romain Teyssier
    romain.teyssier@uzh.ch
    Institute for Computational Science
    University of Zurich
    +41 44 635 60 20

    Dr. Joachim Stadel
    stadel@physik.uzh.ch
    Institute for Computational Science
    University of Zurich
    Phone: +41 44 635 58 16

Researchers from the University of Zürich have simulated the formation of our entire Universe with a large supercomputer. A gigantic catalogue of about 25 billion virtual galaxies has been generated from 2 trillion digital particles. This catalogue is being used to calibrate the experiments on board the Euclid satellite, which will be launched in 2020 to investigate the nature of dark matter and dark energy.

    ESA/Euclid spacecraft

The Cosmic Web: A section of the virtual universe, a billion light years across, showing how dark matter is distributed in space, with dark matter halos (the yellow clumps) interconnected by dark filaments. Cosmic voids, shown as the white areas, are the lowest-density regions in the Universe. (Image: Joachim Stadel, UZH)

Over a period of three years, a group of astrophysicists from the University of Zürich has developed and optimised a revolutionary code to describe with unprecedented accuracy the dynamics of dark matter and the formation of large-scale structures in the Universe. As Joachim Stadel, Douglas Potter and Romain Teyssier report in their recently published paper [Computational Astrophysics and Cosmology], the code (called PKDGRAV3) has been designed to use optimally the available memory and processing power of modern supercomputing architectures, such as the “Piz Daint” supercomputer of the Swiss National Computing Center (CSCS). The code was executed on this world-leading machine for only 80 hours, and generated a virtual universe of two trillion (i.e., two thousand billion or 2 × 10¹²) macro-particles representing the dark matter fluid, from which a catalogue of 25 billion virtual galaxies was extracted.

    Cray Piz Daint supercomputer of the Swiss National Supercomputing Center (CSCS)
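PKDGRAV3 itself relies on sophisticated tree and multipole methods to make two trillion particles tractable, but the underlying problem is ordinary Newtonian gravity stepped forward in time. The following toy sketch shows the basic idea with a direct O(N²) force sum and a leapfrog integrator; it is an illustration only, nothing like the production code's algorithms, units, or scale:

```python
import numpy as np

G = 1.0  # gravitational constant in arbitrary code units

def accelerations(pos, mass, softening=1e-2):
    """Direct O(N^2) gravitational accelerations with Plummer softening.
    (Production codes such as PKDGRAV3 use tree/multipole methods instead.)"""
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]   # (N, N, 3) pairwise separations
    dist2 = (diff ** 2).sum(axis=-1) + softening ** 2
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                          # no self-force
    weights = mass[np.newaxis, :, np.newaxis] * inv_d3[:, :, np.newaxis]
    return G * (diff * weights).sum(axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick leapfrog step, the standard N-body integrator."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    return pos, vel

# Toy run: 100 particles instead of two trillion
rng = np.random.default_rng(0)
pos = rng.standard_normal((100, 3))
vel = np.zeros((100, 3))
mass = np.full(100, 1.0 / 100)
for _ in range(10):
    pos, vel = leapfrog_step(pos, vel, mass, 0.01)
print(pos.shape)  # (100, 3)
```

The softening parameter prevents the force from diverging when two particles pass close together, a standard trick in collisionless simulations like this one.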

    Studying the composition of the dark universe

Thanks to the high precision of their calculation, featuring a dark matter fluid evolving under its own gravity, the researchers have simulated the formation of small concentrations of matter, called dark matter halos, in which we believe galaxies like the Milky Way form.

Caterpillar Project: a Milky-Way-size dark-matter halo and its subhalos (circled), from an enormous suite of simulations. Griffen et al. 2016

    The challenge of this simulation was to model galaxies as small as one tenth of the Milky Way, in a volume as large as our entire observable Universe. This was the requirement set by the European Euclid mission, whose main objective is to explore the dark side of the Universe.

    Measuring subtle distortions

Indeed, about 95 percent of the Universe is dark. The cosmos consists of 23 percent dark matter and 72 percent dark energy. “The nature of dark energy remains one of the main unsolved puzzles in modern science,” says Romain Teyssier, UZH professor for computational astrophysics.

    Earthbound science of Dark Energy

    Dark Energy Camera [DECam], built at FNAL


NOAO/CTIO Victor M Blanco 4m Telescope, which houses DECam at Cerro Tololo, Chile

A puzzle that can be cracked only through indirect observation: When the Euclid satellite captures the light of billions of galaxies over large areas of the sky, astronomers will measure the very subtle distortions that arise from the deflection of light from these background galaxies by a foreground, invisible distribution of mass – dark matter. “That is comparable to the distortion of light by a somewhat uneven glass pane,” says Joachim Stadel from the Institute for Computational Science of the UZH.

    Optimizing observation strategies of the satellite

This new virtual galaxy catalogue will help optimize the observational strategy of the Euclid experiment and minimize various sources of error, before the satellite embarks on its six-year data collecting mission in 2020. “Euclid will perform a tomographic map of our Universe, tracing back in time more than 10 billion years of evolution in the cosmos,” Stadel says. From the Euclid data, researchers will obtain new information on the nature of this mysterious dark energy, but also hope to discover new physics beyond the standard model, such as a modified version of general relativity or a new type of particle.

See the full article here.


    The University of Zürich (UZH, German: Universität Zürich), located in the city of Zürich, is the largest university in Switzerland, with over 26,000 students. It was founded in 1833 from the existing colleges of theology, law, medicine and a new faculty of philosophy.

Currently, the university has seven faculties: Philosophy, Human Medicine, Economic Sciences, Law, Mathematics and Natural Sciences, Theology and Veterinary Medicine. The university offers the widest range of subjects and courses of any Swiss higher education institution.

     
  • richardmitnick 8:31 am on May 29, 2017 Permalink | Reply
Tags: Dark Energy, Harnessing the energy generated when freshwater meets saltwater

    From Penn State via phys.org: “Harnessing the energy generated when freshwater meets saltwater” 

    Penn State Bloc

    Pennsylvania State University

    phys.org

    May 29, 2017
    Jennifer Matthews

    Credit: Pennsylvania State University

    Penn State researchers have created a new hybrid technology that produces unprecedented amounts of electrical power where seawater and freshwater combine at the coast.

    “The goal of this technology is to generate electricity from where the rivers meet the ocean,” said Christopher Gorski, assistant professor in environmental engineering at Penn State. “It’s based on the difference in the salt concentrations between the two water sources.”

    That difference in salt concentration has the potential to generate enough energy to meet up to 40 percent of global electricity demands. Though methods currently exist to capture this energy, the two most successful methods, pressure retarded osmosis (PRO) and reverse electrodialysis (RED), have thus far fallen short.

    PRO, the most common system, selectively allows water to transport through a semi-permeable membrane, while rejecting salt. The osmotic pressure created from this process is then converted into energy by turning turbines.

    “PRO is so far the best technology in terms of how much energy you can get out,” Gorski said. “But the main problem with PRO is that the membranes that transport the water through foul, meaning that bacteria grows on them or particles get stuck on their surfaces, and they no longer transport water through them.”

    This occurs because the holes in the membranes are incredibly small, so they become blocked easily. In addition, PRO doesn’t have the ability to withstand the necessary pressures of super salty waters.

    The second technology, RED, uses an electrochemical gradient to develop voltages across ion-exchange membranes.

    “Ion exchange membranes only allow either positively charged ions to move through them or negatively charged ions,” Gorski explained. “So only the dissolved salt is going through, and not the water itself.”

    Here, the energy is created when chloride or sodium ions are kept from crossing ion-exchange membranes as a result of selective ion transport. Ion-exchange membranes don’t require water to flow through them, so they don’t foul as easily as the membranes used in PRO; however, the problem with RED is that it doesn’t have the ability to produce large amounts of power.
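As a rough illustration of where RED's voltage comes from, the ideal (Nernst) potential across a perfectly selective ion-exchange membrane depends only on the ratio of the two salt concentrations. The concentrations below are typical textbook values for seawater and river water, not figures from the Penn State study:

```python
import math

R = 8.314      # gas constant, J mol^-1 K^-1
T = 298.15     # temperature, K (25 degrees C)
F = 96485.0    # Faraday constant, C mol^-1

def nernst_potential(c_high, c_low, z=1):
    """Ideal potential (volts) across a perfectly selective ion-exchange
    membrane separating two salt concentrations, for ions of charge z."""
    return (R * T) / (z * F) * math.log(c_high / c_low)

# Illustrative concentrations (mol/L): seawater ~0.5 M vs. river water ~0.01 M
e = nernst_potential(0.5, 0.01)
print(f"~{e * 1000:.0f} mV per membrane")   # ~100 mV
```

A single membrane thus yields only on the order of 0.1 V, which is why RED stacks many membrane pairs in series, and why combining the membrane voltage with electrode reactions, as described below, raises the power that can be extracted.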

    Photograph of the concentration flow cell. Two plates clamp the cell together, which contains two narrow channels fed with either synthetic freshwater or seawater through the plastic lines. Credit: Pennsylvania State University

    A third technology, capacitive mixing (CapMix), is a relatively new method also being explored. CapMix is an electrode-based technology that captures energy from the voltage that develops when two identical electrodes are sequentially exposed to two different kinds of water with varying salt concentrations, such as freshwater and seawater. Like RED, the problem with CapMix is that it’s not able to yield enough power to be viable.

    Gorski, along with Bruce Logan, Evan Pugh Professor and the Stan and Flora Kappe Professor of Environmental Engineering, and Taeyoung Kim, post-doctoral scholar in environmental engineering, may have found a solution to these problems. The researchers have combined both the RED and CapMix technologies in an electrochemical flow cell.

    “By combining the two methods, they end up giving you a lot more energy,” Gorski said.

    The team constructed a custom-built flow cell in which two channels were separated by an anion-exchange membrane. A copper hexacyanoferrate electrode was then placed in each channel, and graphite foil was used as a current collector. The cell was then sealed using two end plates with bolts and nuts. Once built, one channel was fed with synthetic seawater, while the other channel was fed with synthetic freshwater. Periodically switching the water’s flow paths allowed the cell to recharge and further produce power. From there, they examined how the cutoff voltage used for switching flow paths, external resistance and salt concentrations influenced peak and average power production.

    “There are two things going on here that make it work,” said Gorski. “The first is you have the salt going to the electrodes. The second is you have the chloride transferring across the membrane. Since both of these processes generate a voltage, you end up developing a combined voltage at the electrodes and across the membrane.”

    To determine the gained voltage of the flow cell depending on the type of membrane used and salinity difference, the team recorded open-circuit cell voltages while feeding two solutions at 15 milliliters per minute. Through this method, they identified that stacking multiple cells did influence electricity production. At 12.6 watts per square meter, this technology leads to peak power densities that are unprecedentedly high compared to previously reported RED (2.9 watts per square meter), and on par with the maximum calculated values for PRO (9.2 watts per square meter), but without the fouling problems.

“What we’ve shown is that we can bring that power density up to what people have reported for pressure retarded osmosis and to a value much higher than what has been reported if you use these two processes alone,” Gorski said.

Though the results are promising, the researchers want to do more research on the stability of the electrodes over time and want to know how other elements in seawater, like magnesium and sulfate, might affect the performance of the cell.

    “Pursuing renewable energy sources is important,” Gorski said. “If we can do carbon neutral energy, we should.”

    No science paper referenced.

See the full article here.


    Penn State Campus

    WHAT WE DO BEST

    We teach students that the real measure of success is what you do to improve the lives of others, and they learn to be hard-working leaders with a global perspective. We conduct research to improve lives. We add millions to the economy through projects in our state and beyond. We help communities by sharing our faculty expertise and research.

    Penn State lives close by no matter where you are. Our campuses are located from one side of Pennsylvania to the other. Through Penn State World Campus, students can take courses and work toward degrees online from anywhere on the globe that has Internet service.

    We support students in many ways, including advising and counseling services for school and life; diversity and inclusion services; social media sites; safety services; and emergency assistance.

    Our network of more than a half-million alumni is accessible to students when they want advice and to learn about job networking and mentor opportunities as well as what to expect in the future. Through our alumni, Penn State lives all over the world.

    The best part of Penn State is our people. Our students, faculty, staff, alumni, and friends in communities near our campuses and across the globe are dedicated to education and fostering a diverse and inclusive environment.

     
  • richardmitnick 9:41 pm on May 18, 2017 Permalink | Reply
Tags: Dark Energy

    From Nautilus: “The Physicist Who Denies Dark Matter” Revised and Improved from post of 2017/03/01 

    Nautilus


    May 18, 2017
    Oded Carmeli

    Mordehai Milgrom. Cosmos on Nautilus

    Maybe Newtonian physics doesn’t need dark matter to work.

“He is one of those dark matter people,” Mordehai Milgrom said about a colleague stopping by his office at the Weizmann Institute of Science. Milgrom introduced us, telling me that his friend is searching for evidence of dark matter in a project taking place just down the hall.

    “There are no ‘dark matter people’ and ‘MOND people,’ ” his colleague retorted.

    http://www.astro.umd.edu/~ssm/mond/

    “I am ‘MOND people,’” Milgrom proudly proclaimed, referring to Modified Newtonian Dynamics, his theory that fixes Newtonian physics instead of postulating the existence of dark matter and dark energy—two things that, according to the standard model of cosmology, constitute 95.1 percent of the total mass-energy content of the universe.

This friendly incident is indicative of Mordehai (“Moti”) Milgrom’s calmly quixotic character. There is something almost misleading about the 70-year-old physicist wearing shorts in the hot Israeli summer, whose soft voice breaks whenever he gets excited. Nothing about his pleasant demeanor reveals that this man claims to be the third person to correct Newtonian physics: First Max Planck (with quantum theory), then Einstein (with relativity), now Milgrom.

    This year marks Milgrom’s 50th year at the Weizmann.


    Weizmann Institute Campus

    I visited him there to learn more about how it feels to be a science maverick, what he appreciates about Thomas Kuhn’s The Structure of Scientific Revolutions, and why he thinks dark matter and dark energy don’t exist.


    What inspired you to dedicate your life to the motion of stars?

    I remember very vividly the way physics struck me. I was 16 and I thought: Here is a way to understand how things work, far beyond the understanding of my peers. It wasn’t a long-term plan. It was a daily attraction. I simply loved physics, the same way other people love art or sports. I never dreamed of one day making a major discovery, like correcting Newton.

    I had a terrific physics teacher at school, but when you study textbook material, you’re studying done deals. You still don’t see the effort that goes into making breakthrough science, when things are unclear and advances are made intuitively and often go wrong. They don’t teach you that at school. They teach you that science always goes forward: You have a body of knowledge, and then someone discovers something and expands that body of knowledge. But it doesn’t really work that way. The progress of science is never linear.

    How did you get involved with the problem of dark matter?

    Toward the end of my Ph.D., the physics department here wanted to expand. So they asked three top Ph.D. students working on particle physics to choose a new field. We chose astrophysics, and the Weizmann Institute pulled some strings with institutions abroad so they would accept us as postdocs. And so I went to Cornell to fill my gaps in astrophysics.

    After a few years in high energy astrophysics, working on the physics of X-ray radiation in space, I decided to move to yet another field: The dynamics of galaxies. It was a few years after the first detailed measurements of the speed of stars orbiting spiral galaxies came in. And, well, there was a problem with the measurements.

To understand this problem, one needs to wrap one’s head around some celestial rotations. Our planet orbits the sun, which, in turn, orbits the center of the Milky Way galaxy. Inside solar systems, the gravitational pull from the mass of the sun and the speed of the planets are in balance. By Newton’s laws, this is why Mercury, the innermost planet in our solar system, orbits the sun at over 100,000 miles per hour, while the outermost planet, Neptune, crawls along at just over 10,000 miles per hour.

    Milky Way NASA/JPL-Caltech /ESO R. Hurt

    Now, you might assume that the same logic would apply to galaxies: The farther away the star is from the galaxy’s center, the slower it revolves around it; however, while at smaller radiuses the measurements were as predicted by Newtonian physics, farther stars proved to move much faster than predicted from the gravitational pull of the mass we see in these galaxies. The observed gap got a lot wider when, in the late 1970s, radio telescopes were able to detect and measure the cold gas clouds at the outskirts of galaxies. These clouds orbit the galactic center five times farther than the stars, and thus the anomaly grew to become a major scientific puzzle.

    One way to solve this puzzle is to simply add more matter. If there is too little visible mass at the center of galaxies to account for the speed of stars and gas, perhaps there is more matter than meets the eye, matter that we cannot see, dark matter.

    What made you first question the very existence of dark matter?

What struck me was some regularity in the anomaly. The rotational velocities were not just larger than expected, they became constant with radius. Why? Sure, if there was dark matter, the speed of stars would be greater, but the rotation curves, meaning the rotational speed drawn as a function of the radius, could still go up and down depending on its distribution. But they didn’t. That really struck me as odd. So, in 1980, I went on my sabbatical at the Institute for Advanced Study in Princeton with the following hunch: If the rotational speeds are constant, then perhaps we’re looking at a new law of nature. If Newtonian physics can’t predict the fixed curves, perhaps we should fix Newton, instead of making up a whole new class of matter just to fit our measurements.

    If you’re going to change the laws of nature that work so well in our own solar system, you need to find a property that differentiates solar systems from galaxies. So I made up a chart of different properties, such as size, mass, speed of rotation, etc. For each parameter, I put in the Earth, the solar system and some galaxies. For example, galaxies are bigger than solar systems, so perhaps Newton’s laws don’t work over large distances? But if this was the case, you would expect the rotation anomaly to grow bigger in bigger galaxies, while, in fact, it is not. So I crossed that one out and moved on to the next properties.

    I finally struck gold with acceleration: The pace at which the velocity of objects changes.


    We usually think of earthbound cars that accelerate in the same direction, but imagine a merry-go-round. You could be going in circles and still accelerate. Otherwise, you would simply fall off. The same goes for celestial merry-go-rounds. And it’s in acceleration that we find a big difference in scales, one that justifies modifying Newton: The normal acceleration for a star orbiting the center of a galaxy is about a hundred million times smaller than that of the Earth orbiting the sun.

For those small accelerations, MOND introduces a new constant of nature, called a₀. If you studied physics in high school, you probably remember Newton’s second law: force equals mass times acceleration, or F = ma. While this is a perfectly good tool when dealing with accelerations much greater than a₀, such as those of the planets around our sun, I suggested that at significantly lower accelerations, lower even than that of our sun around the galactic center, force becomes proportional to the square of the acceleration, or F = ma²/a₀.

    To put it in other words: According to Newton’s laws, the rotation speed of stars around galactic centers should decrease the farther the star is from the center of mass. If MOND is correct, it should reach a constant value, thus eliminating the need for dark matter.
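The flat-curve prediction can be illustrated numerically: in the deep-MOND regime (accelerations far below a₀), the modified law implies v⁴ = GMa₀, independent of radius. The sketch below assumes an illustrative galaxy of 10¹¹ solar masses and the commonly fitted value a₀ ≈ 1.2 × 10⁻¹⁰ m/s²; it is a toy comparison of the two asymptotic behaviors, not a full MOND calculation:

```python
import math

G = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
A0 = 1.2e-10              # Milgrom's constant a0, m/s^2 (commonly fitted value)
M_GAL = 1e11 * 1.989e30   # illustrative baryonic mass: 10^11 solar masses

def v_newton(r):
    """Newtonian circular speed: falls off as 1/sqrt(r)."""
    return math.sqrt(G * M_GAL / r)

def v_mond_deep(r):
    """Deep-MOND circular speed: a = sqrt(g_N * a0) gives v^4 = G*M*a0,
    flat and radius-independent (~200 km/s for these values)."""
    return (G * M_GAL * A0) ** 0.25

kpc = 3.086e19  # metres per kiloparsec
for r_kpc in (10, 30, 100):
    r = r_kpc * kpc
    print(f"r = {r_kpc:3d} kpc: Newton {v_newton(r) / 1000:6.1f} km/s, "
          f"deep-MOND {v_mond_deep(r) / 1000:6.1f} km/s")
```

By 100 kpc the Newtonian speed has dropped by a factor of about three while the deep-MOND speed is unchanged, which is the flat rotation curve described in the interview.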

    What did your colleagues at Princeton think about all this?

    I didn’t share these thoughts with my colleagues at Princeton. I was afraid to come across as, well, crazy. And then, in 1981, when I already had a clear idea of MOND, I didn’t want anyone to jump on my wagon, so to speak, which is even crazier when you think about it. Needless to say [laughs] no one jumped on my wagon, even when I desperately wanted them to.

    Well, you were 35 and you proposed to fix Newton.

    Why not? What’s the big deal? If something doesn’t work, fix it. I wasn’t trying to be bold. I was very naïve at the time. I didn’t understand that scientists are just as swayed as other people by conventions and interests.

    Like Thomas Kuhn’s The Structure of Scientific Revolutions.


I love that book. I read it several times. It showed me how my life’s story has happened to so many other scientists throughout history. Sure, it’s easy to make fun of people who once objected to what we now know is good science, but are we any different? Kuhn stresses that these objectors are usually good scientists with good reasons to object. It is just that the dissenters usually have a unique point of view of things that is not shared by most others. I laugh about it now, because MOND has made such progress, but there were times when I felt depressed and isolated.

    What’s it like being a science maverick?

    By and large, the last 35 years have been exciting and rewarding exactly because I have been advocating a maverick paradigm. I am a loner by nature, and despite the daunting and doubting times, I much prefer this to being carried with the general flow. I was quite confident in the basic validity of MOND from the very start, which helped me a lot in taking all this in stride, but there are two great advantages to the lingering opposition to MOND: Firstly, it gave me time to make more contributions to MOND than I would had the community jumped on the MOND wagon early on. Secondly, once MOND is accepted, the long and wide resistance to it will only have proven how nontrivial an idea it is.

    By the end of my sabbatical in Princeton, I had secretly written three papers introducing MOND to the world. Publishing them, however, was a whole different story. At first I sent my kernel paper to journals such as Nature and Astrophysical Journal Letters, and it got rejected almost off-hand. It took a long time until all three papers were published, side by side, in Astrophysical Journal.

    The first person to hear about MOND was my wife Yvonne. Frankly, tears come to my eyes when I say this. Yvonne is not a scientist, but she has been my greatest supporter.

    The first scientist to back MOND was another physics maverick: The late Professor Jacob Bekenstein, who was the first to suggest that black holes should have a well-defined entropy, later dubbed the Bekenstein-Hawking entropy. After I submitted the initial MOND trilogy, I sent the preprints to several astrophysicists, but Jacob was the first scientist I discussed MOND with. He was enthusiastic and encouraging from the very start.

    Slowly but surely, this tiny opposition to dark matter grew from just two physicists to several hundred proponents, or at least scientists who take MOND seriously. Dark matter is still the scientific consensus, but MOND is now a formidable opponent that proclaims the emperor has no clothes, that dark matter is our generation’s ether.

    So what happened? As far as dark matter is concerned, nothing really. A host of experiments searching for dark matter, including the Large Hadron Collider, many underground experiments and several space missions, have failed to directly observe its very existence. Meanwhile, MOND was able to accurately predict the rotation of more and more spiral galaxies—over 150 galaxies to date, to be precise.

    All of them? Some papers claim that MOND wasn’t able to predict the dynamics of certain galaxies.

    That’s true and it’s perfectly fine, because MOND’s predictions are based on measurements. Given the distribution of regular, visible matter alone, MOND can predict the dynamics of galaxies. But that prediction is based on our initial measurements. We measure the light coming in from a galaxy to calculate its mass, but we often don’t know the distance to that galaxy for sure, so we don’t know for certain just how massive that galaxy really is. And there are other variables, such as molecular gas, that we can’t observe at all. So yes, some galaxies don’t perfectly match MOND’s predictions, but all in all, it’s almost a miracle that we have enough data on galaxies to prove MOND right, over and over again.

    Your opponents say MOND’s greatest flaw is its incompatibility with relativistic physics.

    In 2004, Bekenstein proposed TeVeS (Tensor-Vector-Scalar gravity), a relativistic gravitational theory for MOND.

    http://astroweb.case.edu/ssm/mond/

    Since then, several different relativistic MOND formulations have been put forth, including one by me, called Bimetric MOND, or BIMOND.

    So, no, incorporating MOND into Einsteinian physics is no longer a challenge. I still hear this claim made, but only by people who parrot others who themselves are not abreast of the developments of the last 10 years. There are several relativistic versions of MOND. What remains a challenge is demonstrating that MOND can account for the mass anomalies in cosmology.

    Another argument that cosmologists often make is that dark matter is needed not just for motion within galaxies, but on even larger scales. What does MOND have to say about that?

    According to the Big Bang theory, the universe began from a singularity 13.8 billion years ago. And, just as in galaxies, observations of the cosmic microwave background radiation from the early universe suggest that the gravity of all the visible matter in the universe is simply not enough to form the structures we currently see, like galaxies and stars, in just 13.8 billion years. Once again, dark matter was called to the rescue: it does not emit radiation, but it does interact gravitationally with visible matter. And so, starting from the 1980s, the new cosmological dogma was that dark matter constituted a staggering 95 percent of all matter in the universe. That lasted, well, right until the bomb hit us in 1998.

    It turned out that the expansion of the universe is accelerating, not decelerating as all of us originally thought.

    Timeline of the universe, assuming a cosmological constant. Coldcreation/wikimedia, CC BY-SA

    Any form of genuine matter, dark or not, should decelerate the expansion, not accelerate it. And so a whole new type of entity was invented: dark energy. The accepted cosmology is now that the universe is made up of 70 percent dark energy, 25 percent dark matter and 5 percent regular matter.

    Dark energy depiction. Image: Volker Springel/Max Planck Institute for Astrophysics

    But dark energy is just a quick fix, the same as dark matter is. And just as in galaxies, you can either invent a whole new type of energy and then spend years trying to understand its properties, or you can try fixing your theory.

    Among other things, MOND points to a very deep connection between structure and dynamics in galaxies and cosmology. Such a connection is not expected in accepted physics: galaxies are tiny structures on the grand scale of the universe, and in the standard picture they can behave differently without contradicting the cosmological consensus. MOND, however, binds the two together.

    This connection is surprising: for whatever reason, the MOND constant a0 is close to the acceleration that characterizes the universe itself. In fact, MOND’s constant is roughly the speed of light squared divided by the radius of the universe.
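
    That closeness can be checked with back-of-the-envelope numbers (a sketch; the values of H0 and a0 used below are standard figures assumed here, not taken from the interview):

    ```python
    # Order-of-magnitude check: MOND's acceleration constant a0 is close to
    # c^2 / R_Hubble = c * H0, the acceleration scale set by the universe itself.

    C = 2.998e8    # speed of light, m/s
    H0 = 2.27e-18  # Hubble constant (~70 km/s/Mpc), converted to 1/s
    A0 = 1.2e-10   # measured MOND constant, m/s^2

    hubble_radius = C / H0                 # ~1.3e26 m, a proxy for the "radius of the universe"
    cosmic_accel = C ** 2 / hubble_radius  # equals C * H0, ~6.8e-10 m/s^2

    ratio = cosmic_accel / A0              # ~5.7, close to 2*pi
    print(f"c^2/R = {cosmic_accel:.2e} m/s^2, ratio to a0 = {ratio:.2f}")
    ```

    The two scales agree to within a factor of order unity (a0 is often quoted as roughly c·H0 divided by 2π), which is exactly the coincidence being pointed at.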

    So, indeed, to your question, the conundrum pointed to is valid at present. MOND doesn’t have a sufficient cosmology yet, but we’re working on it. And once we fully understand MOND, I believe we’ll also fully understand the expansion of the universe, and vice versa: A new cosmological theory would explain MOND. Wouldn’t that be amazing?

    What do you think about the proposed unified theories of physics, which merge MOND with quantum mechanics?

    These all hark back to my 1999 paper on MOND as a vacuum effect, where it was pointed out that the quantum vacuum in a universe such as ours may produce MOND behavior within galaxies, with the cosmological constant appearing in the guise of the MOND acceleration constant, a0. But I am greatly gratified to see these propositions put forth, especially because they are made by people outside the traditional MOND community. It is very important that researchers from other backgrounds become interested in MOND and bring new ideas to further our understanding of its origin.

    And what if you had a unified theory of physics that explains everything? What then?

    You know, I’m not a religious person, but I often think about our tiny blue dot, and the painstaking work we physicists do here. Who knows? Perhaps somewhere out there, in one of those galaxies I spent my life researching, there already is a known unified theory of physics, with a variation of MOND built into it. But then I think: So what? We still had fun doing the math. We still had the thrill of trying to wrap our heads around the universe, even if the universe never noticed it at all.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 8:19 am on May 17, 2017 Permalink | Reply
    Tags: Dark Energy, New Explanation for Dark Energy? Tiny Fluctuations of Time and Space

    From Universe Today: “New Explanation for Dark Energy? Tiny Fluctuations of Time and Space” 


    Universe Today

    16 May 2017
    Matt Williams

    A new study by researchers from the University of British Columbia offers a new explanation of Dark Energy. Credit: NASA

    Since the late 1920s, astronomers have known that the Universe is in a state of expansion. First predicted by Einstein’s Theory of General Relativity, this realization went on to inform the most widely accepted cosmological model – the Big Bang Theory. However, things became somewhat confusing during the 1990s, when improved observations showed that the Universe’s rate of expansion has been accelerating for billions of years.

    This led to the theory of Dark Energy, a mysterious invisible force that is driving the expansion of the cosmos. Much like Dark Matter, which explained the “missing mass,” it then became necessary to find this elusive energy, or at least provide a coherent theoretical framework for it. A new study from the University of British Columbia (UBC) seeks to do just that by postulating that the Universe is expanding due to fluctuations in space and time.

    The study – which was recently published in the journal Physical Review D – was led by Qingdi Wang, a PhD student in the Department of Physics and Astronomy at UBC. Under the supervision of UBC Professor William Unruh (the man who proposed the Unruh Effect) and with assistance from Zhen Zhu (another PhD student at UBC), they provide a new take on Dark Energy.

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe) Credit: Alex Mittelmann

    Inflationary Universe. NASA/WMAP

    See the full article here.


     
  • richardmitnick 12:44 pm on May 9, 2017 Permalink | Reply
    Tags: Dark Energy, Detecting infrared light

    From JPL-Caltech: “NASA Delivers Detectors for ESA’s Euclid Spacecraft” 


    JPL-Caltech

    May 9, 2017
    Elizabeth Landau
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-354-6425
    elizabeth.landau@jpl.nasa.gov

    Giuseppe Racca
    Euclid Project Manager
    Directorate of Science
    European Space Agency
    giuseppe.racca@esa.int

    René Laureijs
    Euclid Project Scientist
    Directorate of Science
    European Space Agency
    Rene.Laureijs@esa.int

    ESA/Euclid spacecraft

    Three detector systems for the Euclid mission, led by ESA (European Space Agency), have been delivered to Europe for the spacecraft’s near-infrared instrument. The detector systems are key components of NASA’s contribution to this upcoming mission to study some of the biggest questions about the universe, including those related to the properties and effects of dark matter and dark energy — two critical but invisible phenomena that scientists think make up the vast majority of our universe.

    “The delivery of these detector systems is a milestone for what we hope will be an extremely exciting mission, the first space mission dedicated to going after the mysterious dark energy,” said Michael Seiffert, the NASA Euclid project scientist based at NASA’s Jet Propulsion Laboratory, Pasadena, California, which manages the development and implementation of the detector systems.

    Euclid will carry two instruments: a visible-light imager (VIS) and a near-infrared spectrometer and photometer (NISP). A special light-splitting plate on the Euclid telescope enables incoming light to be shared by both instruments, so they can carry out observations simultaneously.

    The spacecraft, scheduled for launch in 2020, will observe billions of faint galaxies and investigate why the universe is expanding at an accelerating pace. Astrophysicists think dark energy is responsible for this effect, and Euclid will explore this hypothesis and help constrain dark energy models. This census of distant galaxies will also reveal how galaxies are distributed in our universe, which will help astrophysicists understand how the delicate interplay of the gravity of dark matter, luminous matter and dark energy forms large-scale structures in the universe.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Additionally, the location of galaxies in relation to each other tells scientists how they are clustered. Dark matter, an invisible substance accounting for over 80 percent of matter in our universe, can cause subtle distortions in the apparent shapes of galaxies. That is because its gravity bends light that travels from a distant galaxy toward an observer, which changes the appearance of the galaxy when it is viewed from a telescope.

    Gravitational Lensing NASA/ESA

    Euclid’s combination of visible and infrared instruments will examine this distortion effect and allow astronomers to probe dark matter and the effects of dark energy.

    Detecting infrared light, which is invisible to the human eye, is especially important for studying the universe’s distant galaxies. Much like the Doppler effect for sound, where a siren’s pitch seems higher as it approaches and lower as it moves away, the frequency of light from an astronomical object gets shifted with motion. Light from objects that are traveling away from us appears redder, and light from those approaching us appears bluer. Because the universe is expanding, distant galaxies are moving away from us, so their light gets stretched out to longer wavelengths. Between 6 and 10 billion light-years away, galaxies are brightest in infrared light.
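
    The wavelength stretching described above is simple to quantify (a sketch; the rest wavelength and redshift values are illustrative assumptions, not figures from the article):

    ```python
    # Cosmological redshift: light emitted at rest wavelength lambda_emit from
    # redshift z is observed at lambda_obs = lambda_emit * (1 + z).

    def observed_wavelength(rest_nm: float, z: float) -> float:
        """Wavelength (nm) an observer measures for light emitted at rest_nm from redshift z."""
        return rest_nm * (1.0 + z)

    H_ALPHA = 656.3  # nm: a prominent visible (red) emission line of hydrogen
    for z in (0.5, 1.0, 2.0):
        print(f"z = {z}: observed at {observed_wavelength(H_ALPHA, z):.1f} nm")
    # At z = 1 the line arrives at ~1313 nm, in the near-infrared range
    # that an instrument like NISP is designed to observe.
    ```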

    JPL procured the NISP detector systems, which were manufactured by Teledyne Imaging Sensors of Camarillo, California. They were tested at JPL and at NASA’s Goddard Space Flight Center, Greenbelt, Maryland, before being shipped to France and the NISP team.

    Each detector system consists of a detector, a cable and a “readout electronics chip” that converts infrared light to data signals read by an onboard computer and transmitted to Earth for analysis. Sixteen detectors will fly on Euclid, each composed of 2040 by 2040 pixels. They will cover a field of view slightly larger than twice the area covered by a full moon. The detectors are made of a mercury-cadmium-telluride mixture and are designed to operate at extremely cold temperatures.
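
    The figures quoted above pin down the infrared focal plane’s pixel budget (simple arithmetic on the numbers in the text; the variable names are ours):

    ```python
    # Euclid NISP focal plane: 16 detectors, each 2040 x 2040 pixels.
    DETECTORS = 16
    PIXELS_PER_SIDE = 2040

    pixels_per_detector = PIXELS_PER_SIDE ** 2       # 4,161,600 pixels
    total_pixels = DETECTORS * pixels_per_detector   # 66,585,600 pixels (~67 Mpix)
    print(f"{total_pixels:,} pixels total (~{total_pixels / 1e6:.0f} megapixels)")
    ```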

    “The U.S. Euclid team has overcome many technical hurdles along the way, and we are delivering superb detectors that will enable the collection of unprecedented data during the mission,” said Ulf Israelsson, the NASA Euclid project manager, based at JPL.

    Delivery to ESA of the next set of detectors for NISP is planned in early June. The Centre de Physique des Particules de Marseille, France, will provide further characterization of the detector systems. The final detector focal plane will then be assembled at the Laboratoire d’Astrophysique de Marseille and integrated with the rest of NISP for instrument tests.

    For more information about Euclid, visit:

    http://sci.esa.int/Euclid

    See the full article here.



    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.


     